Dec 16 12:46:53 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 16 12:46:53 crc restorecon[4588]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:53 crc restorecon[4588]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:53 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 
12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 12:46:54 crc 
restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 
12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 
12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc 
restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 16 12:46:54 crc restorecon[4588]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 16 12:46:54 crc kubenswrapper[4757]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:46:54 crc kubenswrapper[4757]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 16 12:46:54 crc kubenswrapper[4757]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:46:54 crc kubenswrapper[4757]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 16 12:46:54 crc kubenswrapper[4757]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 16 12:46:54 crc kubenswrapper[4757]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.779855 4757 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785573 4757 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785611 4757 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785624 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785633 4757 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785641 4757 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785654 4757 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785664 4757 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785675 4757 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785685 4757 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785694 4757 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785703 4757 feature_gate.go:330] unrecognized feature gate: Example Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785713 4757 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785721 4757 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785730 4757 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785738 4757 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785748 4757 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785757 4757 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785765 4757 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785774 4757 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785781 4757 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785789 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785797 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785830 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785841 4757 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785850 4757 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785859 4757 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785867 4757 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785876 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785884 4757 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785891 4757 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785899 4757 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785907 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785917 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785925 4757 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785932 4757 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785940 4757 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785948 4757 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785956 4757 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785964 4757 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785971 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785979 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785987 4757 
feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.785995 4757 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786080 4757 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786091 4757 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786101 4757 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786110 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786119 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786127 4757 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786135 4757 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786143 4757 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786151 4757 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786159 4757 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786166 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786174 4757 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786183 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786190 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786198 4757 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786208 4757 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786216 4757 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786223 4757 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786231 4757 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786238 4757 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786246 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786253 4757 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786261 4757 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786269 4757 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 
12:46:54.786276 4757 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786284 4757 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786291 4757 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.786299 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786643 4757 flags.go:64] FLAG: --address="0.0.0.0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786662 4757 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786678 4757 flags.go:64] FLAG: --anonymous-auth="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786689 4757 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786701 4757 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786710 4757 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786721 4757 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786733 4757 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786742 4757 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786752 4757 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786761 4757 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786772 4757 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786781 4757 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786791 4757 flags.go:64] FLAG: --cgroup-root="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786800 4757 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786809 4757 flags.go:64] FLAG: --client-ca-file="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786818 4757 flags.go:64] FLAG: --cloud-config="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786827 4757 flags.go:64] FLAG: --cloud-provider="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786837 4757 flags.go:64] FLAG: --cluster-dns="[]" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786847 4757 flags.go:64] FLAG: --cluster-domain="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786855 4757 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786864 4757 flags.go:64] FLAG: --config-dir="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786873 4757 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786883 4757 flags.go:64] FLAG: --container-log-max-files="5" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786895 4757 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786904 4757 
flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786913 4757 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786922 4757 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786932 4757 flags.go:64] FLAG: --contention-profiling="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786941 4757 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786949 4757 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786959 4757 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786967 4757 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786978 4757 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786987 4757 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.786996 4757 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787034 4757 flags.go:64] FLAG: --enable-load-reader="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787044 4757 flags.go:64] FLAG: --enable-server="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787052 4757 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787065 4757 flags.go:64] FLAG: --event-burst="100" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787074 4757 flags.go:64] FLAG: --event-qps="50" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787083 4757 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787092 4757 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787101 4757 flags.go:64] FLAG: --eviction-hard="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787112 4757 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787121 4757 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787131 4757 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787141 4757 flags.go:64] FLAG: --eviction-soft="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787151 4757 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787160 4757 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787169 4757 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787178 4757 flags.go:64] FLAG: --experimental-mounter-path="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787187 4757 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787196 4757 flags.go:64] FLAG: --fail-swap-on="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787205 4757 flags.go:64] FLAG: --feature-gates="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 
12:46:54.787215 4757 flags.go:64] FLAG: --file-check-frequency="20s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787224 4757 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787233 4757 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787242 4757 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787251 4757 flags.go:64] FLAG: --healthz-port="10248" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787260 4757 flags.go:64] FLAG: --help="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787269 4757 flags.go:64] FLAG: --hostname-override="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787278 4757 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787287 4757 flags.go:64] FLAG: --http-check-frequency="20s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787296 4757 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787305 4757 flags.go:64] FLAG: --image-credential-provider-config="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787314 4757 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787322 4757 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787331 4757 flags.go:64] FLAG: --image-service-endpoint="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787340 4757 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787349 4757 flags.go:64] FLAG: --kube-api-burst="100" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787358 4757 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787367 4757 flags.go:64] FLAG: --kube-api-qps="50" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787376 4757 flags.go:64] FLAG: --kube-reserved="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787385 4757 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787394 4757 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787403 4757 flags.go:64] FLAG: --kubelet-cgroups="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787412 4757 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787422 4757 flags.go:64] FLAG: --lock-file="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787431 4757 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787440 4757 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787450 4757 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787463 4757 flags.go:64] FLAG: --log-json-split-stream="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787473 4757 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787482 4757 flags.go:64] FLAG: --log-text-split-stream="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787490 4757 
flags.go:64] FLAG: --logging-format="text" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787500 4757 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787509 4757 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787518 4757 flags.go:64] FLAG: --manifest-url="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787544 4757 flags.go:64] FLAG: --manifest-url-header="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787556 4757 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787565 4757 flags.go:64] FLAG: --max-open-files="1000000" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787576 4757 flags.go:64] FLAG: --max-pods="110" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787585 4757 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787594 4757 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787603 4757 flags.go:64] FLAG: --memory-manager-policy="None" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787612 4757 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787622 4757 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787632 4757 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787642 4757 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787661 4757 flags.go:64] FLAG: --node-status-max-images="50" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787670 4757 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787679 4757 flags.go:64] FLAG: --oom-score-adj="-999" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787688 4757 flags.go:64] FLAG: --pod-cidr="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787696 4757 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787709 4757 flags.go:64] FLAG: --pod-manifest-path="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787719 4757 flags.go:64] FLAG: --pod-max-pids="-1" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787728 4757 flags.go:64] FLAG: --pods-per-core="0" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787737 4757 flags.go:64] FLAG: --port="10250" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787746 4757 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787755 4757 flags.go:64] FLAG: --provider-id="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787765 4757 flags.go:64] FLAG: --qos-reserved="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787774 4757 flags.go:64] FLAG: --read-only-port="10255" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787783 4757 flags.go:64] FLAG: --register-node="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787792 4757 flags.go:64] 
FLAG: --register-schedulable="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787801 4757 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787820 4757 flags.go:64] FLAG: --registry-burst="10" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787829 4757 flags.go:64] FLAG: --registry-qps="5" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787838 4757 flags.go:64] FLAG: --reserved-cpus="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787847 4757 flags.go:64] FLAG: --reserved-memory="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787859 4757 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787868 4757 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787877 4757 flags.go:64] FLAG: --rotate-certificates="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787886 4757 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787894 4757 flags.go:64] FLAG: --runonce="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787903 4757 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787912 4757 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787922 4757 flags.go:64] FLAG: --seccomp-default="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787931 4757 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787940 4757 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787949 4757 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787958 4757 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787967 4757 flags.go:64] FLAG: --storage-driver-password="root" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787976 4757 flags.go:64] FLAG: --storage-driver-secure="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787985 4757 flags.go:64] FLAG: --storage-driver-table="stats" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.787993 4757 flags.go:64] FLAG: --storage-driver-user="root" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788025 4757 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788035 4757 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788044 4757 flags.go:64] FLAG: --system-cgroups="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788053 4757 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788067 4757 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788075 4757 flags.go:64] FLAG: --tls-cert-file="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788084 4757 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788095 4757 flags.go:64] FLAG: --tls-min-version="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788104 4757 
flags.go:64] FLAG: --tls-private-key-file="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788113 4757 flags.go:64] FLAG: --topology-manager-policy="none" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788121 4757 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788130 4757 flags.go:64] FLAG: --topology-manager-scope="container" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788139 4757 flags.go:64] FLAG: --v="2" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788150 4757 flags.go:64] FLAG: --version="false" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788161 4757 flags.go:64] FLAG: --vmodule="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788171 4757 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.788181 4757 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788382 4757 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788391 4757 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788402 4757 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788410 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788418 4757 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788426 4757 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788435 4757 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788445 4757 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
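Editor's note: the flags.go:64 FLAG dump above (ending at --volume-stats-agg-period) is the kubelet echoing every command-line flag with its effective value, one quoted entry per flag. Recovering it as a mapping is straightforward; a sketch:

```python
import re

# Each echo line has the shape: flags.go:64] FLAG: --name="value"
FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

def parse_flag_dump(lines):
    """Build {flag: value} from the kubelet's flags.go:64 echo lines."""
    return {m.group(1): m.group(2) for line in lines for m in FLAG_RE.finditer(line)}

sample = ['I1216 12:46:54.786864 4757 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"']
print(parse_flag_dump(sample))  # {'--config': '/etc/kubernetes/kubelet.conf'}
```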
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788456 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788464 4757 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788473 4757 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788483 4757 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788491 4757 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788499 4757 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788508 4757 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788517 4757 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788526 4757 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788534 4757 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788541 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788549 4757 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788557 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788565 4757 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788573 4757 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788581 4757 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788588 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788596 4757 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788604 4757 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788611 4757 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788619 4757 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788626 4757 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788634 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788642 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788650 4757 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788657 4757 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788665 4757 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788672 4757 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788680 4757 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788688 4757 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788696 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788706 4757 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788716 4757 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788726 4757 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788734 4757 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788747 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788756 4757 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788766 4757 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788775 4757 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788784 4757 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788791 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788799 4757 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788807 4757 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788814 4757 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788822 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788830 4757 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788838 4757 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788845 4757 feature_gate.go:330] unrecognized feature gate: Example Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788853 4757 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788863 4757 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
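Editor's note: every kubenswrapper message in this journal carries a klog header — a severity letter (I/W/E), the date as mmdd, a timestamp, the PID, and the emitting file:line, terminated by "]". A regex sketch that splits such a line into its fields:

```python
import re

# klog header: <sev><mmdd> <time> <pid> <file:line>] <msg>
# e.g. "W1216 12:46:54.788766 4757 feature_gate.go:351] Setting deprecated feature gate KMSv1=true."
KLOG = re.compile(
    r"(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>[\d:.]+) +(?P<pid>\d+) (?P<src>\S+:\d+)\] (?P<msg>.*)"
)

line = "W1216 12:46:54.788766 4757 feature_gate.go:351] Setting deprecated feature gate KMSv1=true."
print(KLOG.match(line).groupdict())
# {'sev': 'W', 'mmdd': '1216', 'time': '12:46:54.788766', 'pid': '4757',
#  'src': 'feature_gate.go:351', 'msg': 'Setting deprecated feature gate KMSv1=true.'}
```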
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788873 4757 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788881 4757 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788890 4757 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788898 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788906 4757 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788914 4757 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788922 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788930 4757 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788937 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788945 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788953 4757 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788961 4757 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.788968 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.789202 4757 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.797487 4757 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.797529 4757 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797717 4757 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797736 4757 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
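Editor's note: once the warnings finish, feature_gate.go:386 prints the resolved gate map in Go's fmt rendering — feature gates: {map[Name:bool ...]} — and the same map reappears verbatim after each later pass, confirming the passes agree. A sketch to lift it into a Python dict:

```python
import re

def parse_gate_map(entry):
    """Parse Go's 'feature gates: {map[Name:true ...]}' rendering into a dict."""
    body = re.search(r"map\[([^\]]*)\]", entry).group(1)
    return {name: val == "true"
            for name, _, val in (pair.partition(":") for pair in body.split())}

entry = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}"
print(parse_gate_map(entry))
# {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'NodeSwap': False}
```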
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797745 4757 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797751 4757 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797757 4757 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797762 4757 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797769 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797774 4757 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797780 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797785 4757 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797790 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797795 4757 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797801 4757 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797806 4757 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797811 4757 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797817 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797823 4757 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797830 4757 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797836 4757 feature_gate.go:330] unrecognized feature gate: Example Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797841 4757 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797846 4757 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797852 4757 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797857 4757 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797863 4757 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797869 4757 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797874 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797879 4757 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797884 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797889 4757 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797894 4757 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797899 4757 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797904 4757 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797909 4757 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797914 4757 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797925 4757 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797930 4757 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797935 4757 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797940 4757 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797945 4757 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797949 4757 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797954 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797959 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797964 4757 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiNetworks Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797969 4757 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797973 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797978 4757 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797983 4757 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797988 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797993 4757 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.797998 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798151 4757 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798160 4757 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798165 4757 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798170 4757 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798175 4757 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798180 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798185 4757 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798190 4757 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798195 4757 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798200 4757 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798207 4757 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798213 4757 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798218 4757 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798223 4757 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798228 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798233 4757 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798238 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798244 4757 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798249 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798254 4757 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798268 4757 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.798277 4757 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798441 4757 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798449 4757 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798456 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798461 4757 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798467 4757 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798473 4757 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798478 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798485 4757 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798491 4757 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798497 4757 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798502 4757 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798508 4757 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798513 4757 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798518 4757 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798524 4757 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798530 4757 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798535 4757 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798541 4757 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798545 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798550 4757 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798555 4757 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798559 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798564 4757 feature_gate.go:330] unrecognized feature gate: Example Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798569 4757 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798574 4757 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798579 4757 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798584 4757 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798589 4757 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798594 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798599 4757 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798603 4757 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798608 4757 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798613 4757 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798618 4757 
feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798624 4757 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798629 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798633 4757 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798638 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798643 4757 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798648 4757 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798653 4757 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798657 4757 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798662 4757 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798668 4757 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798675 4757 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798680 4757 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798686 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798691 4757 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798697 4757 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798702 4757 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798708 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798713 4757 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798719 4757 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798724 4757 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798730 4757 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798736 4757 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798741 4757 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798746 4757 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798750 4757 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 
12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798756 4757 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798760 4757 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798766 4757 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798770 4757 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798775 4757 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798780 4757 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798785 4757 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798791 4757 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798796 4757 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798800 4757 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798805 4757 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.798811 4757 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.798819 4757 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.799041 4757 server.go:940] "Client rotation is on, will bootstrap in background" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.802235 4757 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.802321 4757 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
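Editor's note: "Client rotation is on" plus "Current kubeconfig file contents are still valid" means the kubelet skipped bootstrap and loaded the existing pair from /var/lib/kubelet/pki/kubelet-client-current.pem. The next entries show the computed rotation deadline (2025-12-04, already past at this boot on 2025-12-16), so the kubelet immediately tries to rotate and the CSR POST fails with connection refused because api-int.crc.testing:6443 is not serving yet. To inspect the certificate's validity window by hand, a sketch assuming the third-party `cryptography` package; the file holds both cert and key, so the certificate block is cut out first:

```python
"""Print the validity window of the kubelet's client certificate.

Sketch only: assumes the `cryptography` package is installed and the
path named in the log; the PEM file contains both cert and key.
"""
from cryptography import x509

PEM = "/var/lib/kubelet/pki/kubelet-client-current.pem"

data = open(PEM, "rb").read()
begin = data.index(b"-----BEGIN CERTIFICATE-----")
end = data.index(b"-----END CERTIFICATE-----") + len(b"-----END CERTIFICATE-----")
cert = x509.load_pem_x509_certificate(data[begin:end])

print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before)
print("not after: ", cert.not_valid_after)
```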
Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.803276 4757 server.go:997] "Starting client certificate rotation" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.803307 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.804055 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-04 02:48:11.112205104 +0000 UTC Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.804182 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.808567 4757 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.810681 4757 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.811424 4757 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.821067 4757 log.go:25] "Validated CRI v1 runtime API" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.840191 4757 log.go:25] "Validated CRI v1 image API" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.841638 4757 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.843899 4757 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-16-12-41-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.843948 4757 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.856353 4757 manager.go:217] Machine: {Timestamp:2025-12-16 12:46:54.855170016 +0000 UTC m=+0.282913832 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:56973bd2-6cf5-45e5-a4b6-0f6a651ea1df BootID:a609af24-e04e-486a-9383-84e6961dbf65 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 
Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ee:7e:80 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ee:7e:80 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a6:f3:c1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5f:a5:ae Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f6:ef:18 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:40:87:bb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b2:e4:59:16:4d:16 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6a:4d:8e:74:fc:71 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.856560 4757 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
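The manager.go:217 entry above is cAdvisor's one-shot machine inventory: 8 vCPUs exposed as 8 single-core sockets, 25199480832 bytes (about 25.2 GB) of memory, SwapCapacity:0, the 400 GiB /dev/vda disk, and the OVN bridge and VLAN interfaces. Facts like NumCores and MemoryCapacity ultimately come from standard Linux interfaces; the following rough standalone Go sketch gathers those same two values (it is not cAdvisor's own code, and it reads /proc/meminfo directly).

// Sketch: derive NumCores/MemoryCapacity-style facts from standard
// Linux interfaces, analogous to the values cAdvisor logs above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"runtime"
	"strconv"
	"strings"
)

// memTotalBytes parses the "MemTotal: <n> kB" line from /proc/meminfo.
func memTotalBytes() (uint64, error) {
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return 0, err
	}
	defer f.Close()
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		if fields := strings.Fields(sc.Text()); len(fields) >= 2 && fields[0] == "MemTotal:" {
			kb, err := strconv.ParseUint(fields[1], 10, 64)
			return kb * 1024, err
		}
	}
	return 0, fmt.Errorf("MemTotal not found")
}

func main() {
	mem, err := memTotalBytes()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// Corresponds to NumCores:8 and MemoryCapacity:25199480832 in the log.
	fmt.Printf("NumCores:%d MemoryCapacity:%d\n", runtime.NumCPU(), mem)
}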
Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.856747 4757 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.857539 4757 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.857863 4757 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.857920 4757 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.858331 4757 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.858351 4757 container_manager_linux.go:303] "Creating device plugin manager" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.858696 4757 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.858747 4757 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.859178 4757 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.859342 4757 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.860556 4757 kubelet.go:418] "Attempting to sync node with API server" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.860588 4757 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
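The container_manager_linux.go:272 entry above embeds the kubelet's effective node config as JSON: systemd cgroup driver on cgroup v2, SystemReserved of 200m CPU plus 350Mi each of memory and ephemeral storage, PodPidsLimit 4096, and five hard eviction thresholds (for example memory.available < 100Mi and nodefs.available < 10%). The short Go sketch below decodes just the threshold subset of that JSON to make it readable; the struct names are ad hoc for illustration, not kubelet types.

// Sketch: decode the eviction-threshold subset of the NodeConfig JSON
// printed at container_manager_linux.go:272 (ad hoc structs, not kubelet types).
package main

import (
	"encoding/json"
	"fmt"
)

type threshold struct {
	Signal   string
	Operator string
	Value    struct {
		Quantity   *string // set for absolute thresholds like "100Mi", null otherwise
		Percentage float64 // set for fractional thresholds like 0.1 (= 10%)
	}
}

type nodeConfig struct {
	SystemReserved         map[string]string
	HardEvictionThresholds []threshold
}

func main() {
	// Trimmed verbatim from the log line above.
	raw := `{"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},
	 "HardEvictionThresholds":[
	  {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	  {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}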
Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.860625 4757 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.860773 4757 kubelet.go:324] "Adding apiserver pod source" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.860800 4757 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.863551 4757 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.864515 4757 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.864490 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.864454 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.864623 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.864670 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.866611 4757 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867493 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867537 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867561 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867574 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867595 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867608 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867621 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867643 4757 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867661 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867675 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867717 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.867731 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.868307 4757 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.868950 4757 server.go:1280] "Started kubelet" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.868961 4757 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.869763 4757 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.869774 4757 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.870207 4757 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:46:54 crc systemd[1]: Started Kubernetes Kubelet. Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.877899 4757 server.go:460] "Adding debug handlers to kubelet server" Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.878763 4757 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1881b2e284f4a381 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 12:46:54.868923265 +0000 UTC m=+0.296667091,LastTimestamp:2025-12-16 12:46:54.868923265 +0000 UTC m=+0.296667091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.882947 4757 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.883075 4757 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.883243 4757 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:20:37.593114855 +0000 UTC Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.883404 4757 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.883571 4757 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 16 12:46:54 crc 
kubenswrapper[4757]: I1216 12:46:54.883589 4757 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.883799 4757 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.885901 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.885954 4757 factory.go:55] Registering systemd factory Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.885974 4757 factory.go:221] Registration of the systemd container factory successfully Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.886172 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.886235 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.887475 4757 factory.go:153] Registering CRI-O factory Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.887492 4757 factory.go:221] Registration of the crio container factory successfully Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.887549 4757 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.887567 4757 factory.go:103] Registering Raw factory Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.887581 4757 manager.go:1196] Started watching for new ooms in manager Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.888180 4757 manager.go:319] Starting recovery of all containers Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892189 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892253 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892265 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892279 4757 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892289 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892301 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892311 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892326 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892338 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892348 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892358 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892366 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892375 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892397 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892406 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892415 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892423 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892433 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892442 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892454 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892464 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892475 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892486 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892498 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892509 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892519 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892533 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892569 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892581 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892592 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892601 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892611 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892624 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892635 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.892647 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.896078 4757 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 
12:46:54.896278 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897068 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897101 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897117 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897132 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897147 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897161 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897174 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897190 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897206 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897218 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897231 4757 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897247 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897260 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897277 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897309 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897326 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897354 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897375 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897392 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897410 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897429 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897445 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897461 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897476 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897492 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897509 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897524 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897540 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897556 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897572 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897588 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897603 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897619 4757 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897634 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897649 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897664 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897678 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897692 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897708 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897722 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897738 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897751 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897766 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897781 4757 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897794 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897809 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897824 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897839 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897852 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897866 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897883 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897898 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897913 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897926 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897944 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897957 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.897972 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898047 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898062 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898078 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898094 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898108 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898124 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898139 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898154 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898169 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898185 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898205 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898227 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898269 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898286 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898323 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898339 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898354 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898368 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898382 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898396 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898412 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898426 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898441 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898454 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898466 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898481 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898494 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898526 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898541 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898556 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898567 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898582 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898594 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898616 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898629 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898642 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898656 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898669 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898682 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898695 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898710 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898723 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898738 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898754 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898765 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898779 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898792 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898806 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898821 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898835 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898847 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898860 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898873 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898888 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898903 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898916 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898930 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898942 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898956 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898968 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898981 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.898995 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899024 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899040 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899054 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899066 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899079 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899111 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899125 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899139 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899152 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899167 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899180 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899194 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899211 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899225 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899239 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899252 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899266 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899281 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899297 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899311 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899328 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899340 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899356 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899369 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899384 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899397 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899411 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899424 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899438 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899450 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899465 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899479 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899494 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899510 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899524 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899537 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899551 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899565 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899577 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899590 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899603 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899615 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899627 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899641 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899654 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899667 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899679 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899693 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899706 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899720 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899732 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899743 4757 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899754 4757 reconstruct.go:97] "Volume reconstruction finished" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.899763 4757 reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.909321 4757 manager.go:324] Recovery completed Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.924072 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.926499 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.926558 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.926572 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.927944 4757 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.927969 4757 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.927993 4757 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.937172 4757 policy_none.go:49] "None policy: Start" Dec 16 12:46:54 crc 
kubenswrapper[4757]: I1216 12:46:54.938909 4757 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.938944 4757 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.945461 4757 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.947561 4757 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.947657 4757 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.947696 4757 kubelet.go:2335] "Starting kubelet main sync loop" Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.947784 4757 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:46:54 crc kubenswrapper[4757]: W1216 12:46:54.948735 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.948826 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:54 crc kubenswrapper[4757]: E1216 12:46:54.983790 4757 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.998410 4757 manager.go:334] "Starting Device Plugin manager" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.998460 4757 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.998473 4757 server.go:79] "Starting device plugin registration server" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.998917 4757 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.998937 4757 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.999063 4757 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.999165 4757 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 16 12:46:54 crc kubenswrapper[4757]: I1216 12:46:54.999180 4757 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.009757 4757 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.048672 4757 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.049068 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.049971 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.050029 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.050039 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.050182 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.050318 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.050402 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051033 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051061 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051070 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051166 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051226 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051286 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051209 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051313 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.051293 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052288 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052295 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052308 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052334 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052345 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052382 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052637 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.052711 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053338 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053359 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053368 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053489 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053625 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053671 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053915 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.053995 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054037 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054074 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054091 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054099 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054244 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054270 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054601 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054647 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054662 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054986 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.054996 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.087553 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.099982 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.101257 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.101296 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc 
kubenswrapper[4757]: I1216 12:46:55.101305 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.101328 4757 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.101840 4757 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.103940 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.103993 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104034 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104067 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104129 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104162 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104187 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104202 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104222 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104237 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104268 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104283 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104296 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104310 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.104323 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205622 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205703 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205752 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205779 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205802 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205845 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205866 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205922 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205938 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206057 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206083 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206101 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206137 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.205982 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206167 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206185 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206203 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206213 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206292 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206332 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206360 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206327 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206419 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206444 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206489 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206511 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206562 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206568 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206603 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.206607 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.302585 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.304101 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.304141 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.304152 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.304182 4757 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.304678 4757 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.384471 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.414771 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: W1216 12:46:55.417457 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cacc86879fbd4ee7694a9634f1e1f40d7e15e3a6d641eb4b9294efdb6e8c1be4 WatchSource:0}: Error finding container cacc86879fbd4ee7694a9634f1e1f40d7e15e3a6d641eb4b9294efdb6e8c1be4: Status 404 returned error can't find the container with id cacc86879fbd4ee7694a9634f1e1f40d7e15e3a6d641eb4b9294efdb6e8c1be4 Dec 16 12:46:55 crc kubenswrapper[4757]: W1216 12:46:55.433997 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-16ef17ef542df6e1c7988eae070e9b977bd2a6a51f018fcad2f0650af8bfda6b WatchSource:0}: Error finding container 16ef17ef542df6e1c7988eae070e9b977bd2a6a51f018fcad2f0650af8bfda6b: Status 404 returned error can't find the container with id 16ef17ef542df6e1c7988eae070e9b977bd2a6a51f018fcad2f0650af8bfda6b Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.437029 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.446913 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: W1216 12:46:55.449611 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-fbf9d90cb5917986a4a105761ab2c11a2d9d7b9079b2c6afd1109052540bb555 WatchSource:0}: Error finding container fbf9d90cb5917986a4a105761ab2c11a2d9d7b9079b2c6afd1109052540bb555: Status 404 returned error can't find the container with id fbf9d90cb5917986a4a105761ab2c11a2d9d7b9079b2c6afd1109052540bb555 Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.454433 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:46:55 crc kubenswrapper[4757]: W1216 12:46:55.460814 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-452bf56075641d16448f5dd9da545ed6453e64653b400bc1a9c7ec6e47600f43 WatchSource:0}: Error finding container 452bf56075641d16448f5dd9da545ed6453e64653b400bc1a9c7ec6e47600f43: Status 404 returned error can't find the container with id 452bf56075641d16448f5dd9da545ed6453e64653b400bc1a9c7ec6e47600f43 Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.488424 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 16 12:46:55 crc kubenswrapper[4757]: W1216 12:46:55.703792 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.703876 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.705384 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.707145 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.707179 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.707189 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.707214 4757 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 12:46:55 crc kubenswrapper[4757]: E1216 12:46:55.707502 4757 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.869992 4757 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.884748 4757 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 09:15:12.288271654 +0000 UTC Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.952587 4757 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7" exitCode=0 
Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.952657 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.952739 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8b7b92717655263eb164c9e972313502e55f3d5f062d7073f22d352dab4ffdd1"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.952822 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.954686 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.954716 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.954727 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.956700 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.956745 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"452bf56075641d16448f5dd9da545ed6453e64653b400bc1a9c7ec6e47600f43"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959182 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58" exitCode=0 Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959247 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959270 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fbf9d90cb5917986a4a105761ab2c11a2d9d7b9079b2c6afd1109052540bb555"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959358 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959955 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959981 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.959992 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960126 4757 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257" exitCode=0 Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960172 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960188 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"16ef17ef542df6e1c7988eae070e9b977bd2a6a51f018fcad2f0650af8bfda6b"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960265 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960786 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.960813 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.962325 4757 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd" exitCode=0 Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.962354 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.962370 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cacc86879fbd4ee7694a9634f1e1f40d7e15e3a6d641eb4b9294efdb6e8c1be4"} Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.962416 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.963843 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.964108 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.964131 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.964139 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.964903 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.964928 4757 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:55 crc kubenswrapper[4757]: I1216 12:46:55.964939 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:56 crc kubenswrapper[4757]: W1216 12:46:56.015913 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:56 crc kubenswrapper[4757]: E1216 12:46:56.015979 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:56 crc kubenswrapper[4757]: W1216 12:46:56.160690 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:56 crc kubenswrapper[4757]: E1216 12:46:56.160838 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:56 crc kubenswrapper[4757]: W1216 12:46:56.197964 4757 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:56 crc kubenswrapper[4757]: E1216 12:46:56.198070 4757 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:56 crc kubenswrapper[4757]: E1216 12:46:56.289205 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.507738 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.509040 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.509076 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.509088 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 
12:46:56.509111 4757 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 12:46:56 crc kubenswrapper[4757]: E1216 12:46:56.509540 4757 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.110:6443: connect: connection refused" node="crc" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.824609 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 12:46:56 crc kubenswrapper[4757]: E1216 12:46:56.825653 4757 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.110:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.869892 4757 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.110:6443: connect: connection refused Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.885261 4757 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 19:45:55.279464185 +0000 UTC Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.967304 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.967341 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.967351 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.967360 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.968740 4757 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1" exitCode=0 Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.968797 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.968916 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 
12:46:56.969869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.969895 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.969903 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.971296 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c5f662a9a0f32f67d2c0c9018c4324bd101f82c13f1fc031a545ca559f4b1df5"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.971365 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.971928 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.971945 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.971963 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973196 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973221 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973235 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973333 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973868 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973893 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.973902 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975380 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975402 4757 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975411 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18"} Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975430 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975904 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975927 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:56 crc kubenswrapper[4757]: I1216 12:46:56.975935 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.885617 4757 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 20:14:22.413124523 +0000 UTC Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.885670 4757 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 487h27m24.527457114s for next certificate rotation Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.981255 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9"} Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.981486 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.982606 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.982682 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.982707 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.984607 4757 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23" exitCode=0 Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.984677 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23"} Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.984732 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.984784 4757 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.988571 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.988607 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.988616 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.988636 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.988678 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:57 crc kubenswrapper[4757]: I1216 12:46:57.988713 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.110270 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.111266 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.111304 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.111316 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.111338 4757 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990314 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480"} Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990356 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c"} Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990366 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954"} Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990374 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146"} Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990382 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea"} Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990427 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:58 crc 
kubenswrapper[4757]: I1216 12:46:58.990475 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.990532 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.991444 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.991454 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.991466 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.991475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.991476 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:58 crc kubenswrapper[4757]: I1216 12:46:58.991580 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.121800 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.241746 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.992774 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.992853 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.993691 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.993727 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.993739 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.993796 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.993814 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:46:59 crc kubenswrapper[4757]: I1216 12:46:59.993835 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:00 crc kubenswrapper[4757]: I1216 12:47:00.417457 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.129201 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.133288 4757 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.134403 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.134437 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.134446 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.516580 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.516731 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.517721 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.517757 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:01 crc kubenswrapper[4757]: I1216 12:47:01.517768 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:02 crc kubenswrapper[4757]: I1216 12:47:02.134651 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:02 crc kubenswrapper[4757]: I1216 12:47:02.135357 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:02 crc kubenswrapper[4757]: I1216 12:47:02.135393 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:02 crc kubenswrapper[4757]: I1216 12:47:02.135403 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:03 crc kubenswrapper[4757]: I1216 12:47:03.190450 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 16 12:47:03 crc kubenswrapper[4757]: I1216 12:47:03.190968 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:03 crc kubenswrapper[4757]: I1216 12:47:03.192178 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:03 crc kubenswrapper[4757]: I1216 12:47:03.192210 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:03 crc kubenswrapper[4757]: I1216 12:47:03.192248 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:05 crc kubenswrapper[4757]: E1216 12:47:05.010214 4757 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.378296 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.378461 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:05 crc 
kubenswrapper[4757]: I1216 12:47:05.379929 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.379985 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.379994 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.409256 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.690734 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.690914 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.692068 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.692116 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:05 crc kubenswrapper[4757]: I1216 12:47:05.692125 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:06 crc kubenswrapper[4757]: I1216 12:47:06.143864 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:06 crc kubenswrapper[4757]: I1216 12:47:06.145455 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:06 crc kubenswrapper[4757]: I1216 12:47:06.145502 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:06 crc kubenswrapper[4757]: I1216 12:47:06.145512 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:06 crc kubenswrapper[4757]: I1216 12:47:06.291164 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:06 crc kubenswrapper[4757]: I1216 12:47:06.297156 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.147244 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.148548 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.148596 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.148609 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.151145 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.159506 4757 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.159580 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.516026 4757 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.516082 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.526944 4757 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 12:47:07 crc kubenswrapper[4757]: I1216 12:47:07.527022 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 12:47:08 crc kubenswrapper[4757]: I1216 12:47:08.150515 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:08 crc kubenswrapper[4757]: I1216 12:47:08.151663 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:08 crc kubenswrapper[4757]: I1216 12:47:08.151706 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:08 crc kubenswrapper[4757]: I1216 12:47:08.151725 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:08 crc kubenswrapper[4757]: I1216 12:47:08.410210 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 12:47:08 crc kubenswrapper[4757]: I1216 12:47:08.410285 4757 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.152250 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.153190 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.153310 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.153343 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.248796 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.248937 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.249920 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.249963 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.249980 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:09 crc kubenswrapper[4757]: I1216 12:47:09.253628 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:47:10 crc kubenswrapper[4757]: I1216 12:47:10.154217 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:10 crc kubenswrapper[4757]: I1216 12:47:10.155112 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:10 crc kubenswrapper[4757]: I1216 12:47:10.155140 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:10 crc kubenswrapper[4757]: I1216 12:47:10.155150 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:12 crc kubenswrapper[4757]: E1216 12:47:12.525450 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.527568 4757 trace.go:236] Trace[1369873262]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 12:46:58.180) (total time: 14346ms): Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[1369873262]: ---"Objects listed" error: 14346ms (12:47:12.527) Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[1369873262]: [14.346942013s] [14.346942013s] END Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.527595 4757 reflector.go:368] 
Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.527775 4757 trace.go:236] Trace[726652835]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 12:46:58.368) (total time: 14159ms): Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[726652835]: ---"Objects listed" error: 14159ms (12:47:12.527) Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[726652835]: [14.159515244s] [14.159515244s] END Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.527801 4757 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.528365 4757 trace.go:236] Trace[210939322]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 12:46:58.558) (total time: 13969ms): Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[210939322]: ---"Objects listed" error: 13969ms (12:47:12.528) Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[210939322]: [13.969575536s] [13.969575536s] END Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.528384 4757 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 12:47:12 crc kubenswrapper[4757]: E1216 12:47:12.529023 4757 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.529717 4757 trace.go:236] Trace[1818745817]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 12:46:57.950) (total time: 14578ms): Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[1818745817]: ---"Objects listed" error: 14578ms (12:47:12.529) Dec 16 12:47:12 crc kubenswrapper[4757]: Trace[1818745817]: [14.578626806s] [14.578626806s] END Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.529770 4757 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.534630 4757 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.554142 4757 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.594979 4757 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60288->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.595065 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60288->192.168.126.11:17697: read: connection reset by peer" Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.595162 4757 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 
192.168.126.11:37544->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.595242 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37544->192.168.126.11:17697: read: connection reset by peer" Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.595754 4757 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 12:47:12 crc kubenswrapper[4757]: I1216 12:47:12.596820 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.136540 4757 apiserver.go:52] "Watching apiserver" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.140029 4757 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.140269 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.140689 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.140747 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.140795 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.141036 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.141094 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.141492 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.141689 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.141729 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.142094 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.143745 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.144561 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.144666 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.144835 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.144605 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.145179 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.146152 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.148219 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.148429 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.162652 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.165138 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9" exitCode=255 Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.165172 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9"} Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.173077 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.176649 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.176924 4757 scope.go:117] "RemoveContainer" containerID="568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.184538 4757 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.189981 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.208828 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.222934 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.234112 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.238971 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239256 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239451 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239579 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239687 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239800 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239927 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240064 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240182 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239346 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.239425 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240122 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240163 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240290 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240546 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240546 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240310 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240640 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240668 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240692 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240714 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240737 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240760 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240781 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240805 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240826 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240847 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240868 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240889 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240922 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240948 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240972 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.240993 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241069 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241097 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241121 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241145 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241171 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241196 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241220 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241245 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241269 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241298 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241330 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241353 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241377 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241402 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241427 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241453 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241478 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241525 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241551 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241574 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241599 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241631 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241659 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241684 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241768 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241793 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241818 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241842 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241866 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241889 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241901 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241911 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241902 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.241954 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242104 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242128 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242066 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242196 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242273 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242272 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242296 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242307 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242332 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242354 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242373 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242394 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242411 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242428 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242445 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242467 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242492 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242523 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242544 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242565 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242585 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242603 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242611 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242621 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242641 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242662 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242680 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242697 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242716 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242736 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242737 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242747 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242754 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242777 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242794 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242815 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242833 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242852 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242870 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242884 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242893 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242912 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242933 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242952 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242972 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.242989 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243021 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243040 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243060 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243077 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243097 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243115 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243163 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243183 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243203 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243213 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243222 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243274 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243302 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243306 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243350 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243374 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243397 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243420 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243444 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243469 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243475 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243490 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243513 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243484 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243536 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243555 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243578 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243599 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243619 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243643 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243669 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243689 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243710 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243728 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243748 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243768 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243788 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243809 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243830 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243850 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243866 4757 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243886 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243904 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243925 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243942 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243965 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243985 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.245082 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.245120 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.252664 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.243664 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.245163 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.245241 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.245338 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.245384 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.245566 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:47:13.745132533 +0000 UTC m=+19.172876329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.252892 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.252928 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.252951 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.252970 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.252991 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253027 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253045 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253068 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253085 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253101 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253117 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253133 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253149 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253166 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253186 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253211 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253240 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253262 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253280 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") 
pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253305 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253334 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253354 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253370 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253386 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253402 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253419 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253435 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253459 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253475 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253511 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253531 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253547 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253579 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253595 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253614 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253633 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253651 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253667 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253684 4757 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253701 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253718 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253734 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253752 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253772 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253790 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253806 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253824 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253839 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 
12:47:13.253856 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253873 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253889 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.253906 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254106 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254126 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254145 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254194 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254216 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254237 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254258 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254302 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254320 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254339 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254360 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254377 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254397 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254413 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254421 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254431 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254488 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254523 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254615 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254636 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254651 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254666 4757 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254680 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254694 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254710 
4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254725 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254737 4757 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254750 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254764 4757 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254778 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254791 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254804 4757 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.254873 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.255117 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.255184 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.255413 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.255512 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.255807 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.255975 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.256236 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.256499 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.256963 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.257995 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.258131 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.259867 4757 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.260476 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.260991 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.266626 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.261356 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.247900 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.247916 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.248159 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.248373 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.248977 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249077 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249120 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249154 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249275 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249330 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249358 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249442 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249717 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249481 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249844 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.249953 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.250028 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.261465 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.261598 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.261707 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.261775 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.262028 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.262277 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.262328 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.262854 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:13.762826534 +0000 UTC m=+19.190570330 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.265435 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.265671 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.265835 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.266696 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.266931 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.267181 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.267475 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.267481 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.267577 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.267859 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.268082 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.268144 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.268182 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.268383 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.268662 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.268672 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.269095 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.269596 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.270179 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.270477 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.270590 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.247819 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.271458 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.271931 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.271968 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272047 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272186 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272209 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272460 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272561 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272808 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.272822 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.262926 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273305 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273330 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273343 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.262748 4757 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273401 4757 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273415 4757 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273557 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273574 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273632 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273649 4757 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273661 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273674 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273685 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273697 4757 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273708 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273645 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273815 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.273883 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.274158 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.274208 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.274267 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:13.774246613 +0000 UTC m=+19.201990409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.274784 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.276082 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.276543 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.278292 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.279378 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.281289 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.281454 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.284496 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.285720 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.286173 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.290413 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.290664 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.290686 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.290698 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.290747 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:13.790729594 +0000 UTC m=+19.218473470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.292499 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.292499 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.293118 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.295557 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.296110 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.296260 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.296346 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.296405 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.297803 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.301207 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.301455 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.301711 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.302137 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.302342 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.302370 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.302385 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.302591 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:13.802420459 +0000 UTC m=+19.230164445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.302868 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.302951 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.303229 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.303396 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.303445 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.303560 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.303935 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.304920 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.304937 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.305295 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.305733 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.305935 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.308132 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.306085 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.306111 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.305979 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.306726 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.306820 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.306717 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307206 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307429 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307500 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307611 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307639 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307829 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307883 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307942 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.307961 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.308077 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.308119 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.308641 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.308734 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.308906 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.309131 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.310993 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.311107 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.311367 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.313366 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.313673 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.315500 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316046 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316424 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316726 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316777 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316816 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316941 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.316966 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.317127 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.317171 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.317251 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.319308 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.319496 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.319481 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.319945 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.320313 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.320462 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.320796 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.320859 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.321367 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.321787 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.321912 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.323029 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.323221 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.323220 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.323735 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.323867 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.323917 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.325940 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.326066 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.326657 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.328621 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.332821 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.337021 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.341350 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.343760 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.351414 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.353329 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.355164 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.362617 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.372240 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374587 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374621 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374672 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374683 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374694 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374705 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374714 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374723 4757 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath 
\"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374732 4757 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374740 4757 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374748 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374756 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374765 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374778 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374787 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374795 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374803 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374811 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374819 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374827 4757 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374835 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc 
kubenswrapper[4757]: I1216 12:47:13.374843 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374853 4757 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374861 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374868 4757 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374876 4757 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374885 4757 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374893 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374837 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374913 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374902 4757 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.374995 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375046 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 
12:47:13.375059 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375072 4757 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375084 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375114 4757 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375127 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375138 4757 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375153 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375167 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375199 4757 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375212 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375223 4757 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375232 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375242 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 
12:47:13.375253 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375285 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375296 4757 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375307 4757 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375317 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375331 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375364 4757 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375375 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375387 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375398 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375409 4757 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375441 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375453 4757 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375464 4757 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375474 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375486 4757 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375520 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375533 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375545 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375558 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375569 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375602 4757 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375616 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375627 4757 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375637 4757 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375647 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375677 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" 
(UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375688 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375699 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375711 4757 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375721 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375732 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375765 4757 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375774 4757 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375783 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375791 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375801 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375809 4757 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375836 4757 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375848 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375858 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375869 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375881 4757 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375911 4757 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375922 4757 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375933 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375943 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375953 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375964 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375988 4757 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.375998 4757 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376032 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376045 4757 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376055 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376066 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376076 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376107 4757 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376119 4757 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376129 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376138 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376151 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376162 4757 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376188 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376198 4757 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376211 4757 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 
12:47:13.376223 4757 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376235 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376262 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376271 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376281 4757 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376290 4757 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376305 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376316 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376344 4757 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376355 4757 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376366 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376375 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376384 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376393 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376422 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376433 4757 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376444 4757 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376455 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376466 4757 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376496 4757 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376507 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376518 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376532 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376543 4757 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376554 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376582 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376591 4757 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376602 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376613 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376623 4757 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376633 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376660 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376669 4757 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376678 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376687 4757 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376696 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376706 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376733 4757 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376742 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376751 4757 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376760 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376769 4757 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376778 4757 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376788 4757 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376812 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376821 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376831 4757 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376841 4757 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376852 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376862 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376892 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376904 4757 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376916 4757 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376925 4757 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376935 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376945 4757 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376973 4757 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376982 4757 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.376991 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.377019 4757 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.377029 4757 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.380747 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.390527 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.400329 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16
T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.411265 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.421464 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.442871 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6
dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.455111 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.462878 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.472718 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 12:47:13 crc kubenswrapper[4757]: W1216 12:47:13.476767 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8313814f62458e6d1fa320099e421cb67d41bb6b3d66353ee47e997a6aa1d6b0 WatchSource:0}: Error finding container 8313814f62458e6d1fa320099e421cb67d41bb6b3d66353ee47e997a6aa1d6b0: Status 404 returned error can't find the container with id 8313814f62458e6d1fa320099e421cb67d41bb6b3d66353ee47e997a6aa1d6b0 Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.781132 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.781356 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:47:14.781298877 +0000 UTC m=+20.209042673 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.781615 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.781655 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.781697 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.781731 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.781747 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 12:47:14.781740158 +0000 UTC m=+20.209483954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.781786 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:14.781770439 +0000 UTC m=+20.209514315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.882642 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:13 crc kubenswrapper[4757]: I1216 12:47:13.882680 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882808 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882827 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882839 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882871 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882898 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:14.882881572 +0000 UTC m=+20.310625368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882906 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882919 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:13 crc kubenswrapper[4757]: E1216 12:47:13.882985 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:14.882963594 +0000 UTC m=+20.310707470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.148066 4757 csr.go:261] certificate signing request csr-rwdmd is approved, waiting to be issued Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.169240 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.171202 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57"} Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.171408 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.172926 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ab2399b10267c8c9543a43ea6eaec6a83988f8ac0410158f7cb874dd1c918133"} Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.174555 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893"} Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.174603 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6"} Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.174613 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8313814f62458e6d1fa320099e421cb67d41bb6b3d66353ee47e997a6aa1d6b0"} Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.175756 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0"} Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.175790 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"be858c3f234a757b126e28b066bd21ff2858f5ab9e2e67bedf691d30f3c08cc5"} Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.184487 4757 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.209531 4757 csr.go:257] certificate signing request csr-rwdmd is issued Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.210141 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.231583 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.250786 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.292310 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.326991 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.347586 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.378414 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.427744 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.469566 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.535310 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.554723 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d74
62\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.566927 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.585618 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.601291 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.627815 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.643906 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.789763 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.789818 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.789852 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.789960 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.789994 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.790027 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:16.790014165 +0000 UTC m=+22.217757961 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.790104 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:16.790087077 +0000 UTC m=+22.217830873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.790175 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:47:16.790166969 +0000 UTC m=+22.217910765 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.802997 4757 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.803370 4757 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/events/networking-console-plugin-85b44fc459-gdk6g.1881b2e6cdff6a8c\": read tcp 38.102.83.110:35258->38.102.83.110:6443: use of closed network connection" event="&Event{ObjectMeta:{networking-console-plugin-85b44fc459-gdk6g.1881b2e6cdff6a8c openshift-network-console 26298 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-network-console,Name:networking-console-plugin-85b44fc459-gdk6g,UID:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8,APIVersion:v1,ResourceVersion:25349,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"nginx-conf\" : object \"openshift-network-console\"/\"networking-console-plugin\" not registered,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 12:47:13 +0000 UTC,LastTimestamp:2025-12-16 12:47:14.790052846 +0000 UTC m=+20.217796642,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.891235 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.891279 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891417 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891435 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891448 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891471 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891509 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891526 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891509 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:16.891492677 +0000 UTC m=+22.319236473 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.891612 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-16 12:47:16.89159115 +0000 UTC m=+22.319335016 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.948479 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.948518 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.948575 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.948597 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.948702 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:14 crc kubenswrapper[4757]: E1216 12:47:14.949115 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.952769 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.953512 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.954560 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.955366 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.956076 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.957529 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.958224 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.959250 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.959849 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.960727 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.961242 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.962327 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.962868 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.963455 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.964380 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.964983 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.965968 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.966477 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.967038 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.968111 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.968629 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.969708 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.970184 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.971257 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.971689 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.972414 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.973737 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.974479 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.975482 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.975946 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.976906 4757 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.977070 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.978650 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.979472 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.979864 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.981452 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.984273 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.984984 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.986187 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.986927 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.988038 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.988666 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.989729 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.990336 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.991326 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.991868 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.992772 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.993537 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.995245 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.995661 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.996101 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.996995 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.997576 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 16 12:47:14 crc kubenswrapper[4757]: I1216 12:47:14.998485 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.008751 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.017635 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xhz4k"] Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.017862 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tm6vt"] Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.018030 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.018110 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.022799 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.022972 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.023105 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.039323 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.039397 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.039475 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.039747 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.039785 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.039886 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cz9q7"] Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.040258 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-8lq2b"] Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.040773 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.042098 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.058018 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.058569 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.063371 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.063386 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.063431 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.067367 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.076723 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.078341 4757 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.115027 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.132976 4757 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.154897 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.193313 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.195956 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/494d6b1a-0610-4a79-be5d-3c7e54f5c2eb-hosts-file\") pod \"node-resolver-xhz4k\" (UID: \"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\") " pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196000 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-hostroot\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196032 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-os-release\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196047 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/395610a4-58ca-497e-93a6-714bd6c111c1-cni-binary-copy\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196061 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-conf-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196098 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/43be7319-eac3-4e51-9560-e12d51e97ca6-mcd-auth-proxy-config\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196112 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-kubelet\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196127 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-system-cni-dir\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196142 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb266\" (UniqueName: \"kubernetes.io/projected/494d6b1a-0610-4a79-be5d-3c7e54f5c2eb-kube-api-access-tb266\") pod \"node-resolver-xhz4k\" (UID: \"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\") " pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196176 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/43be7319-eac3-4e51-9560-e12d51e97ca6-rootfs\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196195 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-system-cni-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc 
kubenswrapper[4757]: I1216 12:47:15.196217 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196233 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/e68497b0-de41-4a06-a7ca-2944fded6bd9-kube-api-access-r8xqs\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196248 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-etc-kubernetes\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196262 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtgw6\" (UniqueName: \"kubernetes.io/projected/395610a4-58ca-497e-93a6-714bd6c111c1-kube-api-access-qtgw6\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196279 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-cni-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196297 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-netns\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196315 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-k8s-cni-cncf-io\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196333 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-cnibin\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196355 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/395610a4-58ca-497e-93a6-714bd6c111c1-multus-daemon-config\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " 
pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196371 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-multus-certs\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196389 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sx9t\" (UniqueName: \"kubernetes.io/projected/43be7319-eac3-4e51-9560-e12d51e97ca6-kube-api-access-9sx9t\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196408 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-cni-bin\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196428 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-cni-multus\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196446 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e68497b0-de41-4a06-a7ca-2944fded6bd9-cni-binary-copy\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196462 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e68497b0-de41-4a06-a7ca-2944fded6bd9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196480 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-socket-dir-parent\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196501 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43be7319-eac3-4e51-9560-e12d51e97ca6-proxy-tls\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196519 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-cnibin\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.196537 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-os-release\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.210861 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-16 12:42:14 +0000 UTC, rotation deadline is 2026-09-02 22:03:14.09957424 +0000 UTC Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.210922 4757 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6249h15m58.88865472s for next certificate rotation Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.222526 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.249055 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.266205 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"fi
nishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.285157 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297189 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb266\" (UniqueName: \"kubernetes.io/projected/494d6b1a-0610-4a79-be5d-3c7e54f5c2eb-kube-api-access-tb266\") pod \"node-resolver-xhz4k\" (UID: \"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\") " pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297247 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-kubelet\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297282 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-system-cni-dir\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297306 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/43be7319-eac3-4e51-9560-e12d51e97ca6-rootfs\") pod 
\"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297329 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-system-cni-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297362 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297386 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/e68497b0-de41-4a06-a7ca-2944fded6bd9-kube-api-access-r8xqs\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297407 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-etc-kubernetes\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297428 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtgw6\" (UniqueName: \"kubernetes.io/projected/395610a4-58ca-497e-93a6-714bd6c111c1-kube-api-access-qtgw6\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297447 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-cni-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297467 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-k8s-cni-cncf-io\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297485 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-netns\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297506 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-cnibin\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" 
Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297527 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-cni-multus\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297568 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/395610a4-58ca-497e-93a6-714bd6c111c1-multus-daemon-config\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297589 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-multus-certs\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297612 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sx9t\" (UniqueName: \"kubernetes.io/projected/43be7319-eac3-4e51-9560-e12d51e97ca6-kube-api-access-9sx9t\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297633 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-cni-bin\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297656 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e68497b0-de41-4a06-a7ca-2944fded6bd9-cni-binary-copy\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297683 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-os-release\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297695 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-kubelet\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297705 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e68497b0-de41-4a06-a7ca-2944fded6bd9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 
12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297751 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-socket-dir-parent\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297771 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43be7319-eac3-4e51-9560-e12d51e97ca6-proxy-tls\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297786 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-cnibin\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297802 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/494d6b1a-0610-4a79-be5d-3c7e54f5c2eb-hosts-file\") pod \"node-resolver-xhz4k\" (UID: \"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\") " pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297818 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-hostroot\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297843 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-os-release\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297860 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/395610a4-58ca-497e-93a6-714bd6c111c1-cni-binary-copy\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297873 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-conf-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.297904 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/43be7319-eac3-4e51-9560-e12d51e97ca6-mcd-auth-proxy-config\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298494 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e68497b0-de41-4a06-a7ca-2944fded6bd9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298526 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-system-cni-dir\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298502 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/43be7319-eac3-4e51-9560-e12d51e97ca6-mcd-auth-proxy-config\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298555 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/43be7319-eac3-4e51-9560-e12d51e97ca6-rootfs\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298582 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-etc-kubernetes\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298591 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-system-cni-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298690 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/494d6b1a-0610-4a79-be5d-3c7e54f5c2eb-hosts-file\") pod \"node-resolver-xhz4k\" (UID: \"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\") " pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298725 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-socket-dir-parent\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298850 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-multus-certs\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298932 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-cnibin\") pod 
\"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.298959 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-netns\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299048 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-conf-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299128 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-hostroot\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299139 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-cnibin\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299167 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-cni-bin\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299178 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-run-k8s-cni-cncf-io\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299183 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-os-release\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299192 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-host-var-lib-cni-multus\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299178 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-multus-cni-dir\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299493 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/395610a4-58ca-497e-93a6-714bd6c111c1-os-release\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299608 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e68497b0-de41-4a06-a7ca-2944fded6bd9-cni-binary-copy\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299631 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/395610a4-58ca-497e-93a6-714bd6c111c1-multus-daemon-config\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299506 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e68497b0-de41-4a06-a7ca-2944fded6bd9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.299953 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/395610a4-58ca-497e-93a6-714bd6c111c1-cni-binary-copy\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.301470 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.308330 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43be7319-eac3-4e51-9560-e12d51e97ca6-proxy-tls\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.316970 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sx9t\" (UniqueName: \"kubernetes.io/projected/43be7319-eac3-4e51-9560-e12d51e97ca6-kube-api-access-9sx9t\") pod \"machine-config-daemon-tm6vt\" (UID: \"43be7319-eac3-4e51-9560-e12d51e97ca6\") " pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 
12:47:15.317414 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtgw6\" (UniqueName: \"kubernetes.io/projected/395610a4-58ca-497e-93a6-714bd6c111c1-kube-api-access-qtgw6\") pod \"multus-cz9q7\" (UID: \"395610a4-58ca-497e-93a6-714bd6c111c1\") " pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.318701 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xqs\" (UniqueName: \"kubernetes.io/projected/e68497b0-de41-4a06-a7ca-2944fded6bd9-kube-api-access-r8xqs\") pod \"multus-additional-cni-plugins-8lq2b\" (UID: \"e68497b0-de41-4a06-a7ca-2944fded6bd9\") " pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.319551 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.321792 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb266\" (UniqueName: \"kubernetes.io/projected/494d6b1a-0610-4a79-be5d-3c7e54f5c2eb-kube-api-access-tb266\") pod \"node-resolver-xhz4k\" (UID: \"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\") " pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.335039 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xhz4k" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.341653 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.342121 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:47:15 crc kubenswrapper[4757]: W1216 12:47:15.350576 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod494d6b1a_0610_4a79_be5d_3c7e54f5c2eb.slice/crio-7e1c6d4aff4bc39a62f026e7ee62d789cda108b173413699d07f64996f2332a7 WatchSource:0}: Error finding container 7e1c6d4aff4bc39a62f026e7ee62d789cda108b173413699d07f64996f2332a7: Status 404 returned error can't find the container with id 7e1c6d4aff4bc39a62f026e7ee62d789cda108b173413699d07f64996f2332a7 Dec 16 12:47:15 crc kubenswrapper[4757]: W1216 12:47:15.360428 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43be7319_eac3_4e51_9560_e12d51e97ca6.slice/crio-0c866fa86a33ef394af59227bfc802a3e136a8dbab0342b30f7f31ca764842f2 WatchSource:0}: Error finding container 0c866fa86a33ef394af59227bfc802a3e136a8dbab0342b30f7f31ca764842f2: Status 404 returned error can't find the container with id 0c866fa86a33ef394af59227bfc802a3e136a8dbab0342b30f7f31ca764842f2 Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.360479 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cz9q7" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.365259 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc 
kubenswrapper[4757]: W1216 12:47:15.374265 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395610a4_58ca_497e_93a6_714bd6c111c1.slice/crio-222817aaa36d29b5572c59849caa84b495406f83999546ff58657db990d77d32 WatchSource:0}: Error finding container 222817aaa36d29b5572c59849caa84b495406f83999546ff58657db990d77d32: Status 404 returned error can't find the container with id 222817aaa36d29b5572c59849caa84b495406f83999546ff58657db990d77d32 Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.380201 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.382429 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" Dec 16 12:47:15 crc kubenswrapper[4757]: W1216 12:47:15.401089 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68497b0_de41_4a06_a7ca_2944fded6bd9.slice/crio-f74954ec6fc4b62d2821d6d2c993d009ec2a64adef3c1b7e6e85e049b6762060 WatchSource:0}: Error finding container f74954ec6fc4b62d2821d6d2c993d009ec2a64adef3c1b7e6e85e049b6762060: Status 404 returned error can't find the container with id f74954ec6fc4b62d2821d6d2c993d009ec2a64adef3c1b7e6e85e049b6762060 Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.401975 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.416959 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.428153 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.429695 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.431748 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.441127 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t465t"] Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.441854 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.449502 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.449715 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.449949 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.450125 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.450232 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.450529 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.450689 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.451215 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.477050 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.494136 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503519 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-script-lib\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503561 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-slash\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503582 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-var-lib-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503600 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-systemd\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503617 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c58k\" (UniqueName: \"kubernetes.io/projected/b876e35b-75f8-407e-bf25-f7b3c2024428-kube-api-access-9c58k\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503638 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-kubelet\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503656 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-etc-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503675 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b876e35b-75f8-407e-bf25-f7b3c2024428-ovn-node-metrics-cert\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503696 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-systemd-units\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503715 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-ovn\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503753 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-node-log\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.503801 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-log-socket\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504408 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-bin\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504447 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-netns\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504495 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504525 4757 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-env-overrides\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504544 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504577 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-ovn-kubernetes\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504601 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-netd\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.504625 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-config\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.513230 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.531389 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.551978 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.568815 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.587178 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.602518 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605157 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-node-log\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605199 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-log-socket\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605238 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-bin\") pod \"ovnkube-node-t465t\" (UID: 
\"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605269 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-netns\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605294 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605317 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-env-overrides\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605340 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605369 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-ovn-kubernetes\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605388 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-netd\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605411 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-config\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605430 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-slash\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605450 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-var-lib-openvswitch\") pod \"ovnkube-node-t465t\" (UID: 
\"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605469 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-script-lib\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605493 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-systemd\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605514 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c58k\" (UniqueName: \"kubernetes.io/projected/b876e35b-75f8-407e-bf25-f7b3c2024428-kube-api-access-9c58k\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605534 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-kubelet\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605555 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-etc-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605577 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b876e35b-75f8-407e-bf25-f7b3c2024428-ovn-node-metrics-cert\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605596 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-systemd-units\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605616 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-ovn\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605701 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-ovn\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 
crc kubenswrapper[4757]: I1216 12:47:15.605745 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-node-log\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605773 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-log-socket\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605798 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-bin\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605824 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-netns\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.605854 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606145 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-systemd\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606176 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-netd\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606191 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-slash\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606213 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606248 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-var-lib-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606251 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-ovn-kubernetes\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606256 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-systemd-units\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606276 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-etc-openvswitch\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606305 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-kubelet\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.606823 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-config\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.607109 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-env-overrides\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.607691 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-script-lib\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.610774 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b876e35b-75f8-407e-bf25-f7b3c2024428-ovn-node-metrics-cert\") pod \"ovnkube-node-t465t\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.627567 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c58k\" (UniqueName: \"kubernetes.io/projected/b876e35b-75f8-407e-bf25-f7b3c2024428-kube-api-access-9c58k\") pod \"ovnkube-node-t465t\" (UID: 
\"b876e35b-75f8-407e-bf25-f7b3c2024428\") " pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.630909 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-1
6T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d600
35b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.661531 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.722403 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: 
I1216 12:47:15.729454 4757 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.731121 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.731169 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.731183 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.731284 4757 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.738129 4757 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.738379 4757 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.739284 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\
\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.739850 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.739882 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.739894 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.739908 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.739920 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.754834 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: E1216 12:47:15.756839 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.759737 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.759771 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.759782 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.759798 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.759809 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.764175 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.766897 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: E1216 12:47:15.770430 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.773343 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.773379 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.773393 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.773409 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.773420 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.783677 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: E1216 12:47:15.783930 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.786873 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.787034 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.787124 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.787224 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.787294 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.796245 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: E1216 12:47:15.803169 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.806549 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.806590 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.806599 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.806619 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.806631 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:15 crc kubenswrapper[4757]: E1216 12:47:15.826488 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:15 crc kubenswrapper[4757]: E1216 12:47:15.826774 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.828375 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.828407 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.828414 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.828428 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.828437 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:15 crc kubenswrapper[4757]: W1216 12:47:15.901437 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb876e35b_75f8_407e_bf25_f7b3c2024428.slice/crio-f431c0d2658ec360c04077cc43e3ea314cd54336b9aab2385e3d68efaec17c91 WatchSource:0}: Error finding container f431c0d2658ec360c04077cc43e3ea314cd54336b9aab2385e3d68efaec17c91: Status 404 returned error can't find the container with id f431c0d2658ec360c04077cc43e3ea314cd54336b9aab2385e3d68efaec17c91 Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.931222 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.931265 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.931275 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.931289 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:15 crc kubenswrapper[4757]: I1216 12:47:15.931298 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:15Z","lastTransitionTime":"2025-12-16T12:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.033183 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.033218 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.033227 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.033240 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.033254 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.135155 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.135184 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.135193 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.135204 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.135213 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.183135 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerStarted","Data":"6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.183183 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerStarted","Data":"222817aaa36d29b5572c59849caa84b495406f83999546ff58657db990d77d32"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.184803 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.184866 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.184878 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"0c866fa86a33ef394af59227bfc802a3e136a8dbab0342b30f7f31ca764842f2"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.186041 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xhz4k" event={"ID":"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb","Type":"ContainerStarted","Data":"d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.186088 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xhz4k" event={"ID":"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb","Type":"ContainerStarted","Data":"7e1c6d4aff4bc39a62f026e7ee62d789cda108b173413699d07f64996f2332a7"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.187550 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerStarted","Data":"9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.187581 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerStarted","Data":"f74954ec6fc4b62d2821d6d2c993d009ec2a64adef3c1b7e6e85e049b6762060"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.189206 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.190410 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43" exitCode=0 Dec 16 12:47:16 crc 
kubenswrapper[4757]: I1216 12:47:16.190467 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.190504 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"f431c0d2658ec360c04077cc43e3ea314cd54336b9aab2385e3d68efaec17c91"} Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.196928 4757 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.209133 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\
"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.234528 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.237097 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.237138 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.237149 4757 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.237165 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.237178 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.251406 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b
53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.268976 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.288443 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.324960 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.341805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.341839 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.341847 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.341865 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.341874 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.392999 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.419260 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.443689 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.443768 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.443784 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.443800 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.443810 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.447153 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.459667 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.474181 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.489120 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.515747 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.532843 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.547908 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.547950 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.547961 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.547975 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.547986 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.556026 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.581290 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.605685 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.618547 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.635588 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z 
is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.650884 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.650924 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.650934 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.650949 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.650960 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.654890 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.669451 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.679076 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.693173 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.704155 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.715303 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.726109 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.739459 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.753302 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:16Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.753500 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 
12:47:16.753528 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.753540 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.753555 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.753566 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.817220 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.817345 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.817392 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:47:20.817351222 +0000 UTC m=+26.245095018 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.817451 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.817474 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.817504 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:20.817488545 +0000 UTC m=+26.245232421 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.817603 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.817652 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:20.817642949 +0000 UTC m=+26.245386825 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.856320 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.856346 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.856354 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.856367 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.856377 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.918900 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.918950 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919119 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919138 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919182 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919152 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919196 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919211 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919256 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:20.919240744 +0000 UTC m=+26.346984540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.919285 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:20.919266134 +0000 UTC m=+26.347010000 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.949030 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.949064 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.949396 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.949122 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.949513 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:16 crc kubenswrapper[4757]: E1216 12:47:16.949579 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.958684 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.958722 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.958734 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.958750 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:16 crc kubenswrapper[4757]: I1216 12:47:16.958762 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:16Z","lastTransitionTime":"2025-12-16T12:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.061610 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.061637 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.061645 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.061657 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.061666 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.164561 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.164684 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.164766 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.164847 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.164925 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.194946 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.195368 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.195464 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.195700 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.196614 4757 generic.go:334] "Generic (PLEG): container finished" podID="e68497b0-de41-4a06-a7ca-2944fded6bd9" containerID="9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9" exitCode=0 Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.196740 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerDied","Data":"9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.214228 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.238710 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.252246 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.264661 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.267626 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.267670 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.267682 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.267698 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.267707 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.275760 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.291091 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.311785 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a4850119
20728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.324213 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.345163 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z 
is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.361250 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.370475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.370546 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.370554 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.370571 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.370580 4757 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.376572 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.387558 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.399116 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.411590 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.472382 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.472422 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.472430 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.472444 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.472454 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.574664 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.574703 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.574714 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.574728 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.574739 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.677131 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.677168 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.677178 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.677192 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.677203 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.779818 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.779861 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.779874 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.779890 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.779901 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.882150 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.882200 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.882213 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.882231 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.882242 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.984560 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.984625 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.984638 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.984656 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:17 crc kubenswrapper[4757]: I1216 12:47:17.984668 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:17Z","lastTransitionTime":"2025-12-16T12:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.086876 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.086921 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.086932 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.086947 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.086960 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.188887 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.188928 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.188937 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.188949 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.188958 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.202238 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.202296 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.203477 4757 generic.go:334] "Generic (PLEG): container finished" podID="e68497b0-de41-4a06-a7ca-2944fded6bd9" containerID="1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8" exitCode=0 Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.203515 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerDied","Data":"1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.220380 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.259034 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.293983 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.294249 4757 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.294267 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.294275 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.294288 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.294298 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.311705 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.335602 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.355129 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\
":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.371495 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/li
b/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.382855 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.393675 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.396073 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.396092 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.396100 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.396112 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.396121 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.408728 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.423513 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.443442 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.447004 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6wv7w"] Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.447355 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6wv7w"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.449554 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.449716 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.449941 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.450093 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.462854 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.481852 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.497890 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.498312 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.498336 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.498344 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.498356 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.498366 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.516410 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z 
is after 2025-08-24T17:21:41Z"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.534483 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-serviceca\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.534548 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-host\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.534725 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smpc6\" (UniqueName: \"kubernetes.io/projected/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-kube-api-access-smpc6\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w"
Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.536944 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.548297 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.557991 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.574964 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.589493 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.600957 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.601023 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.601035 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.601049 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.601060 4757 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.603560 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.614419 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.626103 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.635776 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smpc6\" (UniqueName: \"kubernetes.io/projected/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-kube-api-access-smpc6\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.635817 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-serviceca\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.635849 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-host\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.635913 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-host\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.637227 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-serviceca\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.639694 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.652419 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.654752 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smpc6\" (UniqueName: \"kubernetes.io/projected/6e5d7a25-bcdb-4347-b67b-008b3a0c48f8-kube-api-access-smpc6\") pod \"node-ca-6wv7w\" (UID: \"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\") " pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.669162 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.679930 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.692726 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:18Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.703263 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.703290 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.703299 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.703314 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.703324 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.763380 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6wv7w" Dec 16 12:47:18 crc kubenswrapper[4757]: W1216 12:47:18.777194 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5d7a25_bcdb_4347_b67b_008b3a0c48f8.slice/crio-21684dcd58b7d0969ab215013fd9a741826d54a1dbd987f185271cb35afebe73 WatchSource:0}: Error finding container 21684dcd58b7d0969ab215013fd9a741826d54a1dbd987f185271cb35afebe73: Status 404 returned error can't find the container with id 21684dcd58b7d0969ab215013fd9a741826d54a1dbd987f185271cb35afebe73 Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.806380 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.806423 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.806435 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.806453 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.806466 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.908598 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.908629 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.908639 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.908655 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.908702 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:18Z","lastTransitionTime":"2025-12-16T12:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.948700 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.948757 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:18 crc kubenswrapper[4757]: I1216 12:47:18.948710 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:18 crc kubenswrapper[4757]: E1216 12:47:18.948836 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:18 crc kubenswrapper[4757]: E1216 12:47:18.948895 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:18 crc kubenswrapper[4757]: E1216 12:47:18.948985 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.010941 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.010971 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.010979 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.010992 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.011004 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.113833 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.113867 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.113876 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.113889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.113899 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.210103 4757 generic.go:334] "Generic (PLEG): container finished" podID="e68497b0-de41-4a06-a7ca-2944fded6bd9" containerID="6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5" exitCode=0 Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.210184 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerDied","Data":"6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.211315 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6wv7w" event={"ID":"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8","Type":"ContainerStarted","Data":"71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.211351 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6wv7w" event={"ID":"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8","Type":"ContainerStarted","Data":"21684dcd58b7d0969ab215013fd9a741826d54a1dbd987f185271cb35afebe73"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.215494 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.215538 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.215554 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.215573 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.215599 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.223805 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.238088 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.254447 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.269214 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.289481 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z 
is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.309879 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.322964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.323050 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.323061 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.323081 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.323102 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.323685 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.333691 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.344477 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.355844 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.367142 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.380576 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.395161 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.410621 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.426261 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.426290 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.426299 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.426313 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.426322 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.426845 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.438091 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 
12:47:19.450267 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.468504 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.482219 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.492681 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.506790 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.523179 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.528280 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.528316 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.528324 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.528339 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.528349 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.540458 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.554992 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.567739 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.580786 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.593898 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.630287 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z 
is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.631213 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.631254 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.631265 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.631284 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.631295 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.676478 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.697199 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:19Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.734182 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.734238 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.734251 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.734269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.734283 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.836656 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.836700 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.836713 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.836730 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.836750 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.939494 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.939799 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.939809 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.939825 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:19 crc kubenswrapper[4757]: I1216 12:47:19.939834 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:19Z","lastTransitionTime":"2025-12-16T12:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.042020 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.042068 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.042080 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.042099 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.042111 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.144128 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.144164 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.144174 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.144189 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.144200 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.218856 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.222207 4757 generic.go:334] "Generic (PLEG): container finished" podID="e68497b0-de41-4a06-a7ca-2944fded6bd9" containerID="04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d" exitCode=0 Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.222280 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerDied","Data":"04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.235743 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.246579 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.246614 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.246624 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.246639 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.246650 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.250962 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.265664 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.275392 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.286927 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.296760 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.308838 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.328428 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.342071 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.348440 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.348478 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.348488 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.348501 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.348510 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.356036 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.369238 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.380907 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.413690 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.429799 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.449875 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:20Z 
is after 2025-08-24T17:21:41Z" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.451386 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.451422 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.451434 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.451449 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.451459 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.554208 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.554290 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.554303 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.554318 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.554327 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.659582 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.659612 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.659622 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.659636 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.659646 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.762792 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.762820 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.762829 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.762844 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.762853 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.857421 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.857565 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.857601 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.857697 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:47:28.857680885 +0000 UTC m=+34.285424681 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.857734 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.857743 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.857774 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:28.857765887 +0000 UTC m=+34.285509683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.857785 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:28.857780327 +0000 UTC m=+34.285524123 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.864635 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.864699 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.864711 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.864725 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.864735 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.948675 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.948819 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.949264 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.949324 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.949383 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.949440 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.958965 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.959002 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959142 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959147 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959175 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959191 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959158 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959243 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:28.95922312 +0000 UTC m=+34.386967026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959249 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:20 crc kubenswrapper[4757]: E1216 12:47:20.959304 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:28.959290481 +0000 UTC m=+34.387034277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.966732 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.966774 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.966784 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.966798 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:20 crc kubenswrapper[4757]: I1216 12:47:20.966808 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:20Z","lastTransitionTime":"2025-12-16T12:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.069101 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.069188 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.069231 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.069261 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.069282 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.171306 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.171354 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.171364 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.171385 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.171396 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.232039 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.232617 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.232665 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.239308 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerStarted","Data":"1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.254770 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.268790 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.273829 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.273875 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.273893 4757 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.273910 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.273921 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.285137 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.299210 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.300108 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.300391 4757 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.315967 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.329531 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.342548 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.357788 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.372559 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.375887 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.375927 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.375938 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.375954 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.375964 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.385964 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.397385 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.410239 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.419798 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.432772 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.447497 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.473568 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e27
0968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.478751 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.478866 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.478881 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.478896 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.478925 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.493427 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.509418 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.524263 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.542154 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.561523 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.573981 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.581852 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.581900 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.581909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.581922 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.581932 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.591525 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.601133 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.612760 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.629803 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.662783 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.674108 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.683911 4757 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.683949 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.683958 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.683971 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.683980 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.686652 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.698647 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.712968 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:21Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.786521 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.786568 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.786581 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.786600 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.786614 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.888803 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.888848 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.888859 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.888871 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.888880 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.991707 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.991740 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.991750 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.991764 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:21 crc kubenswrapper[4757]: I1216 12:47:21.991773 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:21Z","lastTransitionTime":"2025-12-16T12:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.093792 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.093839 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.093849 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.093865 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.093877 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.195873 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.195943 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.195960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.195973 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.195981 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.255651 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.268942 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.281217 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.293948 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.298647 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.298792 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.298869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.298964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.299064 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.310410 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.335170 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.357710 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.369294 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.380083 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.389083 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.401111 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.401156 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.401166 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.401181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.401191 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.402861 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.414665 4757 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.428618 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.443352 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.457147 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:22Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.504024 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.504067 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc 
kubenswrapper[4757]: I1216 12:47:22.504077 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.504093 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.504107 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.606342 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.606382 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.606391 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.606404 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.606412 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.708909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.708951 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.708960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.708981 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.708991 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.811211 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.811513 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.811575 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.811635 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.811701 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.913422 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.913482 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.913493 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.913509 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.913519 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:22Z","lastTransitionTime":"2025-12-16T12:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.948832 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.948857 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:22 crc kubenswrapper[4757]: I1216 12:47:22.948828 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:22 crc kubenswrapper[4757]: E1216 12:47:22.948948 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:22 crc kubenswrapper[4757]: E1216 12:47:22.949059 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:22 crc kubenswrapper[4757]: E1216 12:47:22.949190 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.015796 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.016264 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.016327 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.016386 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.016443 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.120771 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.120808 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.120817 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.120831 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.120853 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.223399 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.223440 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.223452 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.223472 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.223484 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.247153 4757 generic.go:334] "Generic (PLEG): container finished" podID="e68497b0-de41-4a06-a7ca-2944fded6bd9" containerID="1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6" exitCode=0 Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.247550 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerDied","Data":"1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.259728 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.272282 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.282832 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.293643 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.308302 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.326475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.326525 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.326539 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.326558 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.326574 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.328332 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e27
0968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.428916 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.428967 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.428985 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.429033 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.429050 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.482309 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.496365 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.508893 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.520535 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.532118 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.532156 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.532164 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.532177 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.532185 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.535397 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.548791 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.560166 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.571980 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.589101 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:23Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.634172 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 
crc kubenswrapper[4757]: I1216 12:47:23.634210 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.634222 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.634239 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.634252 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.736157 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.736200 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.736209 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.736223 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.736233 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.838181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.838215 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.838225 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.838244 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.838254 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.940690 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.940733 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.940745 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.940760 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:23 crc kubenswrapper[4757]: I1216 12:47:23.940772 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:23Z","lastTransitionTime":"2025-12-16T12:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.043616 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.043657 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.043668 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.043686 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.043697 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.146241 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.146276 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.146285 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.146299 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.146309 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.248931 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.248959 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.248971 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.248985 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.248995 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.252688 4757 generic.go:334] "Generic (PLEG): container finished" podID="e68497b0-de41-4a06-a7ca-2944fded6bd9" containerID="c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f" exitCode=0 Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.252725 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerDied","Data":"c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.270859 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e27
0968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.297782 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.311363 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.321989 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.334873 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.347961 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.353892 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.353934 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.353944 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.353959 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.353975 4757 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.360040 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.386565 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.449529 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.456067 4757 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.456098 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.456107 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.456121 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.456129 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.461572 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.475962 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.488392 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.503161 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.515326 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.528481 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.558219 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.558258 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.558269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.558283 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.558293 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.660679 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.660713 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.660722 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.660735 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.660744 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.763732 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.763776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.763787 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.763805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.763816 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.866429 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.866466 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.866475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.866489 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.866502 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.949170 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:24 crc kubenswrapper[4757]: E1216 12:47:24.949284 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.949538 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:24 crc kubenswrapper[4757]: E1216 12:47:24.949595 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.949713 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:24 crc kubenswrapper[4757]: E1216 12:47:24.949766 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.968293 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.968318 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.968326 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.968338 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.968346 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:24Z","lastTransitionTime":"2025-12-16T12:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.968812 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:24 crc kubenswrapper[4757]: I1216 12:47:24.989865 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:24Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.004546 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.022151 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.043833 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e27
0968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.062873 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.070571 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.070605 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.070614 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.070626 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.070635 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.078307 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.087763 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.103754 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.122923 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.136495 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.154946 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.168768 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.173578 4757 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.173625 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.173638 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.173656 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.173667 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.185531 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.201155 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.260459 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" event={"ID":"e68497b0-de41-4a06-a7ca-2944fded6bd9","Type":"ContainerStarted","Data":"bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.272389 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.276400 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.276446 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.276466 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.276485 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.276519 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.283319 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.297103 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.310265 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.322683 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.334607 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.348228 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.363908 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0482
8289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.377869 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.378786 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.378828 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.378838 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.378853 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.378865 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.391229 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.407273 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.421414 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.441885 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e27
0968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.464388 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.480341 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.481341 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.481396 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.481405 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.481421 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.481430 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.587360 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.587398 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.587415 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.587431 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.587441 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.689555 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.689591 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.689601 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.689614 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.689623 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.792821 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.792860 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.792875 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.792895 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.792910 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.860517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.860555 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.860564 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.860582 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.860592 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: E1216 12:47:25.872890 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 
2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.876566 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.876670 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.876697 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.876723 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.876745 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: E1216 12:47:25.890923 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 
2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.895707 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.895776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.895794 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.895819 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.895838 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: E1216 12:47:25.910065 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 
2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.914073 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.914120 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.914130 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.914187 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.914200 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: E1216 12:47:25.927223 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 
2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.931044 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.931177 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.931253 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.931322 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.931389 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:25 crc kubenswrapper[4757]: E1216 12:47:25.944905 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:25Z is after 
2025-08-24T17:21:41Z" Dec 16 12:47:25 crc kubenswrapper[4757]: E1216 12:47:25.945055 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.946764 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.946793 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.946805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.946820 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:25 crc kubenswrapper[4757]: I1216 12:47:25.946831 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:25Z","lastTransitionTime":"2025-12-16T12:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.049511 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.049553 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.049563 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.049582 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.049609 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.152069 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.152114 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.152124 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.152138 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.152149 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.254280 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.254321 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.254331 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.254345 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.254354 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.264744 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/0.log" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.267574 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f" exitCode=1 Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.267629 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.268284 4757 scope.go:117] "RemoveContainer" containerID="e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.282773 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.299788 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.313296 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.325619 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.337693 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.359477 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.359989 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.360073 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.360093 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.360104 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.360335 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e27
0968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"or removal\\\\nI1216 12:47:25.517908 5971 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 12:47:25.517915 5971 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 12:47:25.517927 5971 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 12:47:25.517945 5971 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 12:47:25.517957 5971 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 12:47:25.517980 5971 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 12:47:25.517988 5971 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 12:47:25.517997 5971 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 12:47:25.518025 5971 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 12:47:25.518024 5971 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 12:47:25.518039 5971 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 12:47:25.518048 5971 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 12:47:25.518054 5971 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 12:47:25.518434 5971 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 12:47:25.518482 5971 factory.go:656] Stopping watch factory\\\\nI1216 12:47:25.518494 5971 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
12:47:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.378897 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.391131 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.405138 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.414866 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.425194 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.437794 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.450448 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.462396 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.462437 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.462448 4757 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.462463 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.462474 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.462919 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.478564 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:26Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.564849 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.564906 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.564918 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.564932 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.564944 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.668069 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.668114 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.668130 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.668146 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.668179 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.771121 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.771161 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.771170 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.771185 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.771194 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.873223 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.873256 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.873265 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.873282 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.873293 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.948846 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:26 crc kubenswrapper[4757]: E1216 12:47:26.948960 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.949340 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:26 crc kubenswrapper[4757]: E1216 12:47:26.949389 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.949432 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:26 crc kubenswrapper[4757]: E1216 12:47:26.949472 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.976138 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.976176 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.976187 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.976241 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:26 crc kubenswrapper[4757]: I1216 12:47:26.976259 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:26Z","lastTransitionTime":"2025-12-16T12:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.078204 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.078391 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.078403 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.078418 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.078429 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.163049 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.180544 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.180583 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.180595 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.180610 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.180621 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.184496 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.196745 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.216093 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.226338 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.236314 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.246458 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.275024 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/0.log" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.276154 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.277831 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.278298 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.282121 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.282156 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.282166 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.282181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.282193 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.291635 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.294902 4757 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c"] Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.295388 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.297282 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.297321 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.309484 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.324601 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.329079 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8c859be-7650-49fa-a810-1bd096153c33-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.329160 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8c859be-7650-49fa-a810-1bd096153c33-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.329211 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fnf\" (UniqueName: \"kubernetes.io/projected/f8c859be-7650-49fa-a810-1bd096153c33-kube-api-access-85fnf\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.329229 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8c859be-7650-49fa-a810-1bd096153c33-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.342067 4757 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.352574 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.370835 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a4850119
20728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.383299 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.383915 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.383946 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.383954 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.383970 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.383979 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.402676 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"or removal\\\\nI1216 12:47:25.517908 5971 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 12:47:25.517915 5971 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 12:47:25.517927 5971 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 12:47:25.517945 5971 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 12:47:25.517957 5971 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 12:47:25.517980 5971 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 12:47:25.517988 5971 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 12:47:25.517997 5971 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 12:47:25.518025 5971 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 12:47:25.518024 5971 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 12:47:25.518039 5971 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 12:47:25.518048 5971 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 12:47:25.518054 5971 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 12:47:25.518434 5971 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 12:47:25.518482 5971 factory.go:656] Stopping watch factory\\\\nI1216 12:47:25.518494 5971 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
12:47:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.417712 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-c
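Two records above are connected: the node went NotReady with "no CNI configuration file in /etc/kubernetes/cni/net.d/", and the ovnkube-controller container (which mounts /etc/cni/net.d as host-cni-netd) terminated with exit code 1 at 12:47:25 after logging "Stopped ovnkube". On OVN-Kubernetes it is this controller that writes the CNI config, so with it down the directory plausibly stays empty and the runtime keeps reporting NetworkReady=false. A rough illustration of the presence check implied by that message, assuming the same directory; this is not the kubelet's actual implementation:

```go
// Illustrative only: reports whether any CNI network configuration
// exists in the directory named in the NodeNotReady message above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	found := false
	for _, e := range entries {
		// Conventional CNI config extensions; the exact set accepted by
		// the runtime is an assumption here.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file in", dir, "- node stays NotReady")
	}
}
```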
luster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.430059 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fnf\" 
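The kube-apiserver-crc record above buries a useful detail inside lastState.terminated of the kube-apiserver-check-endpoints container: the previous instance exited 255 after a TLS handshake timeout against https://localhost:6443 and a fatal "pods \"kube-apiserver-crc\" not found", and the container has since restarted (restartCount 1, now running). When working through patches like these, it can help to unescape the payload and pull out just the termination info. A small sketch under that assumption; the struct below is trimmed to the fields used here, not the full Kubernetes PodStatus type:

```go
// Sketch: extract container termination details from an already-unescaped
// status patch like the ones the kubelet fails to deliver above.
package main

import (
	"encoding/json"
	"fmt"
)

type terminated struct {
	ExitCode int    `json:"exitCode"`
	Reason   string `json:"reason"`
	Message  string `json:"message"`
}

type containerStatus struct {
	Name      string `json:"name"`
	LastState struct {
		Terminated *terminated `json:"terminated"`
	} `json:"lastState"`
}

type statusPatch struct {
	Status struct {
		ContainerStatuses []containerStatus `json:"containerStatuses"`
	} `json:"status"`
}

func main() {
	// Truncated stand-in for a real patch body; real input would be the
	// unescaped err="failed to patch status ..." payload from the log.
	raw := `{"status":{"containerStatuses":[{"name":"kube-apiserver-check-endpoints",
	  "lastState":{"terminated":{"exitCode":255,"reason":"Error","message":"..."}}}]}}`

	var p statusPatch
	if err := json.Unmarshal([]byte(raw), &p); err != nil {
		panic(err)
	}
	for _, cs := range p.Status.ContainerStatuses {
		if t := cs.LastState.Terminated; t != nil {
			fmt.Printf("%s last terminated: exit=%d reason=%s\n",
				cs.Name, t.ExitCode, t.Reason)
		}
	}
}
```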
(UniqueName: \"kubernetes.io/projected/f8c859be-7650-49fa-a810-1bd096153c33-kube-api-access-85fnf\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.430111 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8c859be-7650-49fa-a810-1bd096153c33-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.430175 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8c859be-7650-49fa-a810-1bd096153c33-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.431120 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8c859be-7650-49fa-a810-1bd096153c33-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.431183 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8c859be-7650-49fa-a810-1bd096153c33-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.431129 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8c859be-7650-49fa-a810-1bd096153c33-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.433086 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.437701 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8c859be-7650-49fa-a810-1bd096153c33-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc 
kubenswrapper[4757]: I1216 12:47:27.445260 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.450762 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fnf\" (UniqueName: \"kubernetes.io/projected/f8c859be-7650-49fa-a810-1bd096153c33-kube-api-access-85fnf\") pod \"ovnkube-control-plane-749d76644c-s8t5c\" (UID: \"f8c859be-7650-49fa-a810-1bd096153c33\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.457154 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.466470 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.474311 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.485345 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.486431 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.486486 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.486500 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.486518 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.486530 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.499032 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:
47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.512158 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.575387 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.585365 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.588683 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.588872 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.588936 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.589037 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.589126 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.597064 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.609338 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" Dec 16 12:47:27 crc kubenswrapper[4757]: W1216 12:47:27.625026 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c859be_7650_49fa_a810_1bd096153c33.slice/crio-4a7def141aaab99c52fab7ecb6a045a5dd8c6543981631f391e7070474e584a8 WatchSource:0}: Error finding container 4a7def141aaab99c52fab7ecb6a045a5dd8c6543981631f391e7070474e584a8: Status 404 returned error can't find the container with id 4a7def141aaab99c52fab7ecb6a045a5dd8c6543981631f391e7070474e584a8 Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.625404 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab
033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.645429 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.662403 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f41
36450f22fa5ec78579e838ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"or removal\\\\nI1216 12:47:25.517908 5971 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 12:47:25.517915 5971 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 12:47:25.517927 5971 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 12:47:25.517945 5971 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 12:47:25.517957 5971 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 12:47:25.517980 5971 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 12:47:25.517988 5971 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 12:47:25.517997 5971 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 12:47:25.518025 5971 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 12:47:25.518024 5971 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 12:47:25.518039 5971 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 12:47:25.518048 5971 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 12:47:25.518054 5971 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 12:47:25.518434 5971 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 12:47:25.518482 5971 factory.go:656] Stopping watch factory\\\\nI1216 12:47:25.518494 5971 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
12:47:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.675410 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:27Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.691127 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.691173 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.691197 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.691211 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.691221 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.794853 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.794914 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.794931 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.794954 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.794971 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.897621 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.897658 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.897667 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.897683 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:27 crc kubenswrapper[4757]: I1216 12:47:27.897694 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:27Z","lastTransitionTime":"2025-12-16T12:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.002609 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.002649 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.002659 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.002690 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.002699 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.105254 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.105285 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.105293 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.105305 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.105317 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.209174 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.209216 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.209228 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.209243 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.209255 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.282453 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" event={"ID":"f8c859be-7650-49fa-a810-1bd096153c33","Type":"ContainerStarted","Data":"b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.282503 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" event={"ID":"f8c859be-7650-49fa-a810-1bd096153c33","Type":"ContainerStarted","Data":"4a7def141aaab99c52fab7ecb6a045a5dd8c6543981631f391e7070474e584a8"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.287923 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/1.log" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.289524 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/0.log" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.298219 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae" exitCode=1 Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.298259 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.298308 4757 scope.go:117] "RemoveContainer" containerID="e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.298916 4757 scope.go:117] "RemoveContainer" containerID="043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae" Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.299112 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.311025 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.311320 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.311389 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.311402 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.311418 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.311430 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.322920 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.347447 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.360883 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.385606 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.403885 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.413818 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.413872 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.413884 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.413900 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.413911 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.424718 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"or removal\\\\nI1216 12:47:25.517908 5971 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 12:47:25.517915 5971 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 12:47:25.517927 5971 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 12:47:25.517945 5971 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 12:47:25.517957 5971 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 12:47:25.517980 5971 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 12:47:25.517988 5971 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 12:47:25.517997 5971 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 12:47:25.518025 5971 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 12:47:25.518024 5971 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 12:47:25.518039 5971 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 12:47:25.518048 5971 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 12:47:25.518054 5971 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 12:47:25.518434 5971 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 12:47:25.518482 5971 factory.go:656] Stopping watch factory\\\\nI1216 12:47:25.518494 5971 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
12:47:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.438470 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.452721 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.466414 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.478346 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.490149 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.500788 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.510533 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.515650 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.515688 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.515700 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.515716 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.515728 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.522924 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.540474 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f
567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.618144 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.618293 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.618366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.618439 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.618508 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.721226 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.721674 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.721778 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.721882 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.721955 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.750171 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-k6rww"] Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.750851 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.750991 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.762618 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.776783 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\
\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.790286 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.804241 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.821362 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
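Every status patch in this stretch fails with the same cause: the kubelet's call to the pod.network-node-identity.openshift.io webhook dies in TLS verification because the serving certificate's NotAfter (2025-08-24T17:21:41Z) is months behind the node's clock (2025-12-16T12:47:28Z). The error string is Go's standard library x509 wording. A minimal, self-contained sketch — all names here are illustrative, nothing is taken from the cluster — that reproduces the exact message by verifying a throwaway certificate whose validity window matches the one in the log:

```go
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"time"
)

func main() {
	// Self-signed throwaway certificate; NotAfter matches the expiry the log reports.
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}
	tmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "network-node-identity (illustrative)"},
		NotBefore:             time.Date(2024, 8, 24, 17, 21, 41, 0, time.UTC),
		NotAfter:              time.Date(2025, 8, 24, 17, 21, 41, 0, time.UTC),
		IsCA:                  true,
		BasicConstraintsValid: true,
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		panic(err)
	}
	cert, _ := x509.ParseCertificate(der)

	roots := x509.NewCertPool()
	roots.AddCert(cert)

	// Verify as if "now" were the kubelet's clock from the log (2025-12-16T12:47:28Z).
	_, err = cert.Verify(x509.VerifyOptions{
		Roots:       roots,
		CurrentTime: time.Date(2025, 12, 16, 12, 47, 28, 0, time.UTC),
	})
	fmt.Println(err)
	// Prints Go's x509 wording, the same text seen in every webhook failure above:
	// x509: certificate has expired or is not yet valid:
	//   current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z
}
```

Because the expiry check fails before any request body is considered, every pod's status patch is rejected identically; nothing recovers until the webhook's certificate is rotated.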
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.823905 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.823939 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.823951 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.823964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.823975 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
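The NodeNotReady condition recorded here carries the container runtime's NetworkPluginNotReady reason: the kubelet finds no CNI configuration file under /etc/kubernetes/cni/net.d/. A rough sketch of that presence check, under the assumption (not taken from the kubelet source) that any .conf, .conflist, or .json file in the directory would count as a usable network config:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the NotReady message above.
	confDir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("NetworkPluginNotReady:", err)
		return
	}
	for _, e := range entries {
		// Treat any recognized CNI config extension as "network ready".
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", e.Name())
			return
		}
	}
	fmt.Println("NetworkPluginNotReady: no CNI configuration file in", confDir)
}
```

This squares with the rest of the log: the multus CNI init containers finished copying plugin binaries, but ovnkube-controller is crash-looping, so no config has been written yet and the node keeps flapping NotReady.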
Has your network provider started?"} Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.834720 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.845088 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.856794 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.877384 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.878163 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.878248 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:47:44.878229459 +0000 UTC m=+50.305973265 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.878275 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.878342 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.878368 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.878394 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2r7b\" (UniqueName: \"kubernetes.io/projected/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-kube-api-access-x2r7b\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.878404 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.878451 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:44.878440044 +0000 UTC m=+50.306183840 (durationBeforeRetry 16s). 
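Both volume operations here are parked with "No retries permitted until ... (durationBeforeRetry 16s)" — the signature of the kubelet's capped exponential backoff for failed mount/unmount operations (nestedpendingoperations). A loose sketch of that pattern follows; the initial delay, doubling factor, and cap are assumptions for illustration, not the kubelet's exact constants:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := time.Second                           // assumed initial backoff
	const maxDelay = 2*time.Minute + 2*time.Second // assumed cap

	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", attempt, delay)
		delay *= 2 // double on every consecutive failure
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

Under these assumed constants the fifth consecutive failure lands on 16s, matching the durationBeforeRetry above; the two underlying causes differ, though — the PVC teardown fails because the kubevirt.io.hostpath-provisioner CSI driver has not re-registered after the restart, while the nginx-conf and networking-console-plugin-cert mounts fail because their configmap/secret objects are not yet registered in the kubelet's object cache.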
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.878490 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.878521 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:44.878512126 +0000 UTC m=+50.306255922 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.892264 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.912203 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"or removal\\\\nI1216 12:47:25.517908 5971 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 12:47:25.517915 5971 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 12:47:25.517927 5971 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 12:47:25.517945 5971 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 12:47:25.517957 5971 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 12:47:25.517980 5971 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 12:47:25.517988 5971 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 12:47:25.517997 5971 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 12:47:25.518025 5971 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 12:47:25.518024 5971 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 12:47:25.518039 5971 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 12:47:25.518048 5971 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 12:47:25.518054 5971 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 12:47:25.518434 5971 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 12:47:25.518482 5971 factory.go:656] Stopping watch factory\\\\nI1216 12:47:25.518494 5971 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
12:47:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.923563 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.926566 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.926603 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.926612 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.926624 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.926634 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:28Z","lastTransitionTime":"2025-12-16T12:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.937559 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.948357 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.948420 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.948492 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.948542 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.948681 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.948735 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.948859 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.961724 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.971974 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.979264 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.979326 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.979364 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.979413 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2r7b\" (UniqueName: \"kubernetes.io/projected/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-kube-api-access-x2r7b\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979481 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979511 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979523 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979575 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:44.979555268 +0000 UTC m=+50.407299064 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979792 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979829 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:29.479818624 +0000 UTC m=+34.907562420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.979966 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.980991 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.981160 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 12:47:28 crc kubenswrapper[4757]: E1216 12:47:28.981285 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:44.981266159 +0000 UTC m=+50.409009955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.984359 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:28Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:28 crc kubenswrapper[4757]: I1216 12:47:28.995504 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2r7b\" (UniqueName: \"kubernetes.io/projected/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-kube-api-access-x2r7b\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.029464 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.029499 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.029510 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.029522 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.029532 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.131878 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.132470 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.132628 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.132772 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.132908 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.235671 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.235909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.235996 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.236159 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.236272 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.302856 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" event={"ID":"f8c859be-7650-49fa-a810-1bd096153c33","Type":"ContainerStarted","Data":"b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1"}
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.304671 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/1.log"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.309144 4757 scope.go:117] "RemoveContainer" containerID="043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae"
Dec 16 12:47:29 crc kubenswrapper[4757]: E1216 12:47:29.309296 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.317042 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.326847 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.337491 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.338604 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.338715 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.338777 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.338844 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.338899 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.350630 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.361949 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.375269 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z"
Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.394409 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e16c740cf3a051ca0cd4783d5428fa000dc89e270968ea5dcb9e6b2aa9cb584f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"message\\\":\\\"or removal\\\\nI1216 12:47:25.517908 5971 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 12:47:25.517915 5971 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 12:47:25.517927 5971 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 12:47:25.517945 5971 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1216 12:47:25.517957 5971 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1216 12:47:25.517980 5971 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 12:47:25.517988 5971 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 12:47:25.517997 5971 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 12:47:25.518025 5971 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 12:47:25.518024 5971 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 12:47:25.518039 5971 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 12:47:25.518048 5971 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 12:47:25.518054 5971 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 12:47:25.518434 5971 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 12:47:25.518482 5971 factory.go:656] Stopping watch factory\\\\nI1216 12:47:25.518494 5971 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.404525 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.423637 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.436467 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.440729 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.440767 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.440778 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.440793 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.440803 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.450836 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.464291 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.478974 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.484753 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:29 crc kubenswrapper[4757]: E1216 12:47:29.484879 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:29 crc kubenswrapper[4757]: E1216 12:47:29.484924 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" 
failed. No retries permitted until 2025-12-16 12:47:30.48491014 +0000 UTC m=+35.912653946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.493322 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.506478 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.519846 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.537164 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.543755 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.543970 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.544087 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.544181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.544254 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.550230 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.561926 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.578090 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.602083 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.616302 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.645465 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.646186 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.646223 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.646235 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.646250 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.646260 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.670178 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.684390 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f
567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.697485 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac1
17eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.710348 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.723801 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.737852 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.748920 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.748965 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.748978 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.748996 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.749028 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.749462 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.768286 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.784207 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.813929 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.826303 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:29Z is after 2025-08-24T17:21:41Z" Dec 16 
12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.851806 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.851854 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.851870 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.851888 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.851937 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.954025 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.954055 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.954064 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.954076 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:29 crc kubenswrapper[4757]: I1216 12:47:29.954088 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:29Z","lastTransitionTime":"2025-12-16T12:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.057078 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.057126 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.057138 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.057155 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.057165 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.159429 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.159925 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.159992 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.160134 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.160219 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.262492 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.262536 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.262551 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.262567 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.262577 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.365341 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.365392 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.365408 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.365428 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.365443 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.467907 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.468200 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.468289 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.468383 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.468466 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.495645 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:30 crc kubenswrapper[4757]: E1216 12:47:30.495764 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:30 crc kubenswrapper[4757]: E1216 12:47:30.495836 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:32.495820212 +0000 UTC m=+37.923564008 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.570724 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.570771 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.570783 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.570800 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.570815 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.673380 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.673443 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.673463 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.673485 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.673501 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
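
The MountVolume.SetUp failure just above is a different symptom of the same startup window: "object \"openshift-multus\"/\"metrics-daemon-secret\" not registered" means the kubelet's local object cache has not yet synced that Secret (typical right after a kubelet restart, or while API connectivity is degraded), so the secret volume plugin cannot materialize it. The operation is parked and retried with a growing delay rather than failing the pod, 2s here (durationBeforeRetry 2s). A minimal sketch of that retry shape, assuming a doubling, capped backoff; the real policy lives in the kubelet's nestedpendingoperations code and may differ in detail:

```python
import time


def retry_with_backoff(op, base_delay=2.0, cap=64.0, attempts=6):
    """Run op(); on failure, park the operation and retry with a doubling, capped delay."""
    delay = base_delay
    for attempt in range(1, attempts + 1):
        try:
            return op()
        except Exception as exc:  # transient: e.g. the Secret is not in the cache yet
            print(f"attempt {attempt} failed: {exc}; no retries permitted for {delay:.0f}s")
            time.sleep(delay)
            delay = min(delay * 2, cap)
    raise RuntimeError("operation did not succeed within the retry budget")
```

Once the Secret lands in the kubelet's cache, a later parked attempt succeeds without operator action; the backoff only bounds how quickly that happens.
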
Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.776216 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.776280 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.776302 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.776332 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.776357 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.878077 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.878129 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.878139 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.878152 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.878162 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.947975 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.948044 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.947975 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:30 crc kubenswrapper[4757]: E1216 12:47:30.948129 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.948168 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:30 crc kubenswrapper[4757]: E1216 12:47:30.948236 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:30 crc kubenswrapper[4757]: E1216 12:47:30.948374 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:30 crc kubenswrapper[4757]: E1216 12:47:30.948517 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.981107 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.981154 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.981165 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.981183 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:30 crc kubenswrapper[4757]: I1216 12:47:30.981194 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:30Z","lastTransitionTime":"2025-12-16T12:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.083842 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.083898 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.083939 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.083959 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.083971 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.186552 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.186588 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.186597 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.186613 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.186624 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.288830 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.288890 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.288911 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.289341 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.289391 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.391986 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.392028 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.392036 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.392048 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.392056 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.494427 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.494461 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.494471 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.494486 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.494496 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.596471 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.596512 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.596522 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.596535 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.596545 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.698734 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.698776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.698787 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.698804 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.698816 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.800501 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.800533 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.800541 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.800555 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.800565 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.903904 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.903954 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.903967 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.903984 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:31 crc kubenswrapper[4757]: I1216 12:47:31.903998 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:31Z","lastTransitionTime":"2025-12-16T12:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.007238 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.007320 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.007343 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.007367 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.007384 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.110790 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.110844 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.110853 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.110865 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.110874 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.213704 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.213753 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.213764 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.213778 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.213789 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.315936 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.315983 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.315994 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.316030 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.316045 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.418313 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.418380 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.418393 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.418411 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.418464 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.515292 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:32 crc kubenswrapper[4757]: E1216 12:47:32.515512 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:32 crc kubenswrapper[4757]: E1216 12:47:32.515605 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:36.515584264 +0000 UTC m=+41.943328130 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.521678 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.521733 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.521746 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.521762 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.521772 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.708950 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.709032 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.709045 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.709089 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.709099 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
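The entry above also shows why the failed secret mount is not retried immediately: each consecutive failure of the same volume operation pushes the next attempt further out, which is where the "durationBeforeRetry 4s" in the nestedpendingoperations message comes from. A minimal Go sketch of that kind of doubling backoff follows; the 500ms initial delay and the cap are illustrative assumptions, not values read out of this log:

package main

import (
	"fmt"
	"time"
)

// Doubling backoff for a retried operation. Under the assumed
// parameters, the fourth consecutive failure yields a 4s delay
// (500ms -> 1s -> 2s -> 4s), matching the delay reported above.
const (
	initialDelay = 500 * time.Millisecond // assumed initial delay
	maxDelay     = 2 * time.Minute        // assumed cap
)

type backoff struct{ last time.Duration }

func (b *backoff) next() time.Duration {
	if b.last == 0 {
		b.last = initialDelay
		return b.last
	}
	b.last *= 2
	if b.last > maxDelay {
		b.last = maxDelay
	}
	return b.last
}

func main() {
	var b backoff
	for i := 1; i <= 4; i++ {
		fmt.Printf("failure %d: retry after %v\n", i, b.next())
	}
}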
Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.811760 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.811811 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.811820 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.811832 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.811842 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.914058 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.914108 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.914121 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.914137 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.914148 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:32Z","lastTransitionTime":"2025-12-16T12:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.948708 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:32 crc kubenswrapper[4757]: E1216 12:47:32.948848 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.949206 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.949236 4757 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:32 crc kubenswrapper[4757]: E1216 12:47:32.949281 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:32 crc kubenswrapper[4757]: I1216 12:47:32.949326 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:32 crc kubenswrapper[4757]: E1216 12:47:32.949438 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:32 crc kubenswrapper[4757]: E1216 12:47:32.949626 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.016705 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.016752 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.016763 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.016780 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.016793 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
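While the runtime reports NetworkReady=false, the kubelet refuses to create new pod sandboxes, which is why every pod above is skipped with "network is not ready" rather than failing individually. The runtime, in turn, keeps reporting NetworkReady=false until a CNI network configuration appears in the directory named in the message. A minimal sketch of that presence check, assuming only the conventional CNI config extensions (.conf, .conflist, .json); this is an illustration, not the actual kubelet or runtime code:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one CNI network
// configuration file. The accepted extensions mirror what CNI config
// loaders conventionally look for.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		fmt.Println("NetworkReady=false: no CNI configuration file")
		return
	}
	fmt.Println("NetworkReady=true")
}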
Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.119469 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.119692 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.119700 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.119714 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.119723 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.223264 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.223310 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.223326 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.223346 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.223358 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.325424 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.325467 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.325479 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.325495 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.325507 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.428339 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.428385 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.428394 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.428409 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.428418 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.530845 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.530909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.530922 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.530938 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.530950 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.632761 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.632787 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.632794 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.632805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.632813 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.735932 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.735976 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.735986 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.736000 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.736041 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.837707 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.837746 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.837756 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.837771 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.837783 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.939548 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.939588 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.939600 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.939615 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:33 crc kubenswrapper[4757]: I1216 12:47:33.939628 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:33Z","lastTransitionTime":"2025-12-16T12:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.041927 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.041966 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.041978 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.041992 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.042028 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.145244 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.145314 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.145335 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.145365 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.145392 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.248145 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.248193 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.248204 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.248219 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.248230 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.350709 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.350749 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.350759 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.350773 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.350782 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.452943 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.452995 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.453035 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.453057 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.453071 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.555242 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.555281 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.555292 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.555306 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.555316 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.657348 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.657385 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.657395 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.657409 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.657419 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.759592 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.759637 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.759648 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.759663 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.759675 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.862865 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.862898 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.862906 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.862920 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.862929 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.948651 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:34 crc kubenswrapper[4757]: E1216 12:47:34.948772 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.949080 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.949179 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:34 crc kubenswrapper[4757]: E1216 12:47:34.949285 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.949373 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:34 crc kubenswrapper[4757]: E1216 12:47:34.949488 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:34 crc kubenswrapper[4757]: E1216 12:47:34.949604 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
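The status-manager errors that follow are a separate failure: the kubelet's status patch for ovnkube-node-t465t is rejected because the pod.network-node-identity.openshift.io webhook presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is long past the node's current clock time. A minimal Go sketch of the validity-window check behind this class of x509 error; the certificate path used here is a hypothetical placeholder:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// checkValidity parses a PEM-encoded certificate and verifies that the
// current time falls inside its [NotBefore, NotAfter] window, the same
// window the TLS handshake in the entries below fails on.
func checkValidity(path string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	block, _ := pem.Decode(data)
	if block == nil {
		return fmt.Errorf("no PEM block found in %s", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return err
	}
	now := time.Now().UTC()
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		return fmt.Errorf("certificate has expired or is not yet valid: current time %s is outside [%s, %s]",
			now.Format(time.RFC3339),
			cert.NotBefore.UTC().Format(time.RFC3339),
			cert.NotAfter.UTC().Format(time.RFC3339))
	}
	return nil
}

func main() {
	// Hypothetical path, for illustration only.
	if err := checkValidity("/tmp/webhook-serving.crt"); err != nil {
		fmt.Println("TLS verification would fail:", err)
	}
}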
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.965495 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.965569 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.965597 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.965612 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.965623 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:34Z","lastTransitionTime":"2025-12-16T12:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.972202 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f41
36450f22fa5ec78579e838ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:34Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:34 crc kubenswrapper[4757]: I1216 12:47:34.983628 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:34Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.003595 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.018106 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.038586 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.050461 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.068408 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.068434 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.068443 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.068455 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.068464 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.068517 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.081560 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.093875 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.104535 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.117149 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.134274 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0482
8289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.147497 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.164431 4757 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119
b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.170726 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.170793 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.170807 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.170827 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.170839 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.181559 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.195892 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.208687 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:35Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.274015 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.274057 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.274066 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.274084 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.274097 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.376809 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.376838 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.376847 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.376861 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.376872 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.479562 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.479599 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.479609 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.479622 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.479631 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.582782 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.582859 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.582871 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.582889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.582926 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.685861 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.685912 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.685924 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.685940 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.685951 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.788144 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.788218 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.788233 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.788246 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.788255 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.891151 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.891220 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.891232 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.891250 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.891261 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.993410 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.993439 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.993448 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.993461 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:35 crc kubenswrapper[4757]: I1216 12:47:35.993470 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:35Z","lastTransitionTime":"2025-12-16T12:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.096126 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.096159 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.096184 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.096198 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.096208 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.187507 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.187554 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.187564 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.187581 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.187591 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.200591 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:36Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.204185 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.204235 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.204247 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.204263 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.204276 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.216268 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:36Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.219463 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.219505 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.219514 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.219530 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.219540 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.231262 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:36Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.239549 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.239595 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.239606 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.239626 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.239637 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.254434 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:36Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.258972 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.259049 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.259060 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.259078 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.259092 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.273514 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:36Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.273687 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.275960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
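Every "Error updating node status" retry above fails for the same reason: the serving certificate of the "node.network-node-identity.openshift.io" webhook at 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, months before the node's current clock time of 2025-12-16T12:47:36Z, so the API server cannot deliver the kubelet's status patch. A minimal Go sketch along the following lines (a hypothetical on-node diagnostic, not part of the kubelet) would confirm the certificate window the error reports; with InsecureSkipVerify the handshake completes even for an expired certificate, so its dates can be read:

    // certwindow.go - print the validity window of the webhook's serving cert.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Address taken from the webhook error in the log above.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true, // read the cert without trusting it
        })
        if err != nil {
            log.Fatalf("dial: %v", err)
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }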
event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.276185 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.276219 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.276240 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.276502 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.379428 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.379463 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.379472 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.379484 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.379495 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.482127 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.482218 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.482230 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.482248 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.482260 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.563147 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.563335 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.563389 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:47:44.56337296 +0000 UTC m=+49.991116756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.584796 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.584823 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.584830 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.584842 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.584850 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
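The nestedpendingoperations.go entry above shows the kubelet's per-volume retry backoff: after the secret lookup fails, this mount is barred from retrying for 8s (durationBeforeRetry). That figure is consistent with a doubling schedule; the sketch below assumes a 500ms base and a roughly two-minute cap, which matches the kubelet's exponential backoff pattern but is not read from this log:

    // backoff.go - the doubling retry delay implied by "durationBeforeRetry 8s".
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond           // assumed base delay
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: wait %v before retrying\n", attempt, delay)
            delay *= 2 // doubling schedule: 500ms, 1s, 2s, 4s, 8s, ...
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }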
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.687302 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.687344 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.687354 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.687366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.687377 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.789334 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.789400 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.789413 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.789428 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.789439 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.891704 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.891750 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.891762 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.891779 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.891789 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.948366 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.948394 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.948437 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.948450 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.948645 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.948730 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.948795 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:47:36 crc kubenswrapper[4757]: E1216 12:47:36.948862 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
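All four sandbox failures above reduce to the condition the Ready status keeps reporting: /etc/kubernetes/cni/net.d/ holds no CNI configuration, so no pod sandbox can be wired to a network. A short Go sketch (a hypothetical on-node check, not kubelet code) verifies exactly what the NetworkPluginNotReady message claims; on a healthy node the cluster network provider writes a *.conf or *.conflist file into this directory:

    // cnicheck.go - confirm whether any CNI config exists where kubelet looks.
    package main

    import (
        "fmt"
        "log"
        "os"
    )

    func main() {
        const dir = "/etc/kubernetes/cni/net.d/" // path taken from the log
        entries, err := os.ReadDir(dir)
        if err != nil {
            log.Fatalf("read %s: %v", dir, err)
        }
        if len(entries) == 0 {
            fmt.Println("empty - matches the NetworkPluginNotReady error above")
            return
        }
        for _, e := range entries {
            fmt.Println(e.Name())
        }
    }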
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.995365 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.995398 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.995407 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.995420 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:36 crc kubenswrapper[4757]: I1216 12:47:36.995430 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:36Z","lastTransitionTime":"2025-12-16T12:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.097915 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.097964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.097975 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.097994 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.098021 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.200455 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.200501 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.200513 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.200534 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.200545 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.302985 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.303085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.303108 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.303138 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.303160 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.405354 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.405402 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.405415 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.405431 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.405442 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.507605 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.507644 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.507652 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.507667 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.507694 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.609798 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.609837 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.609849 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.609862 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.609872 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.712846 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.712895 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.712907 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.712924 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.712934 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.815376 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.815408 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.815417 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.815431 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.815441 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.918209 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.918241 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.918253 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.918269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:37 crc kubenswrapper[4757]: I1216 12:47:37.918279 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:37Z","lastTransitionTime":"2025-12-16T12:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.019939 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.019981 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.019993 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.020020 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.020033 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.122600 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.122639 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.122648 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.122659 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.122668 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.225863 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.225908 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.225924 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.225942 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.225954 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.328132 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.328184 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.328201 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.328216 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.328234 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.430354 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.430417 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.430433 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.430457 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.430474 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.533666 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.533716 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.533727 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.533742 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.533753 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.636309 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.636550 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.636668 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.636754 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.636838 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.739457 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.739496 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.739506 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.739520 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.739528 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.841827 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.841858 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.841869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.841883 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.841896 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.943758 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.943805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.943818 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.943838 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.943852 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:38Z","lastTransitionTime":"2025-12-16T12:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.948095 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:38 crc kubenswrapper[4757]: E1216 12:47:38.948226 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.948234 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.948268 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:38 crc kubenswrapper[4757]: E1216 12:47:38.948455 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:38 crc kubenswrapper[4757]: E1216 12:47:38.948554 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:38 crc kubenswrapper[4757]: I1216 12:47:38.948671 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:38 crc kubenswrapper[4757]: E1216 12:47:38.948880 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.046489 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.046524 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.046533 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.046545 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.046555 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.148910 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.149198 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.149269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.149333 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.149411 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.251357 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.251621 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.251689 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.251765 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.251845 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.353825 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.353860 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.353869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.353883 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.353893 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.456056 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.456095 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.456109 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.456134 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.456144 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.559319 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.559368 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.559382 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.559399 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.559410 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.661481 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.661517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.661526 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.661543 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.661553 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.764364 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.764475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.764486 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.764498 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.764507 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.867559 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.867622 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.867639 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.867664 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.867685 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.970743 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.970798 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.970814 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.970832 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:39 crc kubenswrapper[4757]: I1216 12:47:39.970843 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:39Z","lastTransitionTime":"2025-12-16T12:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.073068 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.073120 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.073131 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.073149 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.073161 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.176264 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.176327 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.176341 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.176359 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.176371 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.278604 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.278648 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.278657 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.278674 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.278685 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.380383 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.380432 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.380441 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.380457 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.380466 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.483154 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.483195 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.483207 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.483222 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.483234 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.585595 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.585655 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.585669 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.585757 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.585772 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.688662 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.688717 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.688728 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.688745 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.688758 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.791099 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.791155 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.791168 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.791185 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.791197 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.893652 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.893692 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.893701 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.893715 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.893724 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.948481 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.948625 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.948697 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:40 crc kubenswrapper[4757]: E1216 12:47:40.948711 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:40 crc kubenswrapper[4757]: E1216 12:47:40.948779 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.948804 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:40 crc kubenswrapper[4757]: E1216 12:47:40.948909 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:40 crc kubenswrapper[4757]: E1216 12:47:40.949044 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.995155 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.995184 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.995192 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.995203 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:40 crc kubenswrapper[4757]: I1216 12:47:40.995212 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:40Z","lastTransitionTime":"2025-12-16T12:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.098435 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.098552 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.098576 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.098654 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.098674 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.201396 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.201666 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.201743 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.201843 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.201922 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.303823 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.303858 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.303867 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.303879 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.303895 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.406463 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.406509 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.406517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.406531 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.406541 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.509114 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.509472 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.509566 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.509658 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.509733 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.611831 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.611875 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.611884 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.611897 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.611907 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.714336 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.714630 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.714756 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.714865 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.714959 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.817635 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.817683 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.817697 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.817715 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.817728 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.920391 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.920443 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.920453 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.920543 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.920562 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:41Z","lastTransitionTime":"2025-12-16T12:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:41 crc kubenswrapper[4757]: I1216 12:47:41.948873 4757 scope.go:117] "RemoveContainer" containerID="043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.023346 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.023726 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.023832 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.023945 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.024078 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.128196 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.128223 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.128231 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.128243 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.128254 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.230871 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.230919 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.230933 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.230951 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.230962 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.333286 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.333325 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.333334 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.333351 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.333361 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.349754 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/1.log" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.352106 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.353132 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.366138 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.386332 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7
da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.398432 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.420564 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a
16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.433492 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.435385 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.435430 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.435442 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.435460 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.435471 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.448266 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.464697 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.484111 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.506379 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.518873 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.532493 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.540848 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.540892 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.540902 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.540919 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.540930 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.552502 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:
47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.566088 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.580109 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.592742 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.605810 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.615116 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.643042 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.643085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.643094 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.643108 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.643119 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.744910 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.744947 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.744956 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.744969 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.744981 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.846503 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.846546 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.846557 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.846572 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.846584 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948715 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948722 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948773 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948801 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948814 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948773 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
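
The repeated `setters.go:603` entries above carry the node's Ready condition as a JSON object after `condition=`. When sifting a capture this long, it can help to pull those payloads out programmatically instead of by eye. Below is a minimal Go sketch, assuming one payload has been copied out of the log; the `nodeCondition` struct is a hand-rolled stand-in for the upstream `v1.NodeCondition` type, reduced to the keys visible in these entries:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// nodeCondition is a hand-rolled stand-in for the Kubernetes NodeCondition
// type, covering only the keys that appear in the condition= payloads above.
type nodeCondition struct {
	Type               string    `json:"type"`
	Status             string    `json:"status"`
	LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
	LastTransitionTime time.Time `json:"lastTransitionTime"`
	Reason             string    `json:"reason"`
	Message            string    `json:"message"`
}

func main() {
	// Payload copied verbatim from one "Node became not ready" entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal:", err)
		return
	}
	// One compact line per condition: type, status, reason, transition time.
	fmt.Printf("%s=%s (%s) since %s\n", c.Type, c.Status, c.Reason,
		c.LastTransitionTime.Format(time.RFC3339))
}
```

Run against each `condition=` payload in the capture, this reduces the readiness spam to one line per transition.
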
Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948828 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.948838 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:42Z","lastTransitionTime":"2025-12-16T12:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:42 crc kubenswrapper[4757]: I1216 12:47:42.949248 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:42 crc kubenswrapper[4757]: E1216 12:47:42.949370 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:42 crc kubenswrapper[4757]: E1216 12:47:42.949445 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:42 crc kubenswrapper[4757]: E1216 12:47:42.949517 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:42 crc kubenswrapper[4757]: E1216 12:47:42.949577 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
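
Every `pod_workers.go:1301` error above bottoms out in the same message: no CNI configuration file in `/etc/kubernetes/cni/net.d/`. A first check on the node is simply whether the network plugin has written any configuration there yet. This is a small illustrative Go sketch, not an official tool; the directory path is taken from the kubelet message itself (plain `/etc/cni/net.d` is the usual default on non-OpenShift setups):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Path taken from the kubelet error message in this log.
	dir := "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}

	found := 0
	for _, e := range entries {
		// CNI configs are .conf, .conflist, or .json files.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files; the network plugin has not written its config yet")
	}
}
```

An empty directory here matches the log: ovnkube-controller is crash-looping (see below), so it never writes its CNI config and the node stays NotReady.
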
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.051889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.052243 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.052443 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.052615 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.052836 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.156061 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.156094 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.156102 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.156115 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.156123 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.258806 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.258877 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.258888 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.258903 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.258912 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.355883 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/2.log" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.356693 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/1.log" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.359039 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff" exitCode=1 Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.359069 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.359098 4757 scope.go:117] "RemoveContainer" containerID="043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.359715 4757 scope.go:117] "RemoveContainer" containerID="9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff" Dec 16 12:47:43 crc kubenswrapper[4757]: E1216 12:47:43.359890 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.360143 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.360161 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.360171 4757 kubelet_node_status.go:724] "Recording event message for node"
node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.360199 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.360236 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.379476 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.391618 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.406476 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.419370 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.430573 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.442928 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 
12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.462168 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.462204 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.462213 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.462227 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.462235 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.463730 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.479356 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.501226 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a
16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://043a4bf0d3a5c56f190fac35235bf35ea0aa8f4136450f22fa5ec78579e838ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328600 6181 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:47:27.328858 6181 obj_retry.go:551] Creating *factory.egressNode crc took: 1.819194ms\\\\nI1216 12:47:27.328881 6181 factory.go:1336] Added *v1.Node event handler 7\\\\nI1216 12:47:27.328911 6181 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1216 12:47:27.329146 6181 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1216 12:47:27.329235 6181 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1216 12:47:27.329268 6181 ovnkube.go:599] Stopped ovnkube\\\\nI1216 12:47:27.329294 6181 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1216 12:47:27.329360 6181 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: 
Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.514277 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.530944 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.548897 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.562615 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.564135 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.564178 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.564191 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.564206 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.564218 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.576712 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.588699 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.605726 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.620798 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T12:47:43Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.666701 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.666742 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.666752 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.666772 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.666784 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.769063 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.769493 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.769594 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.769671 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.769766 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.876101 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.876144 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.876155 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.876168 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.876179 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.978900 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.978943 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.978954 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.978969 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:43 crc kubenswrapper[4757]: I1216 12:47:43.978980 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:43Z","lastTransitionTime":"2025-12-16T12:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.081225 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.081260 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.081268 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.081281 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.081290 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.184043 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.184096 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.184107 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.184122 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.184132 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.286573 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.286640 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.286652 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.286681 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.286696 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.364688 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/2.log" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.368922 4757 scope.go:117] "RemoveContainer" containerID="9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.369103 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.389384 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.389427 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.389436 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.389452 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.389461 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.393734 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.411844 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.430848 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a
16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.447878 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.462650 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.476759 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.488886 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.491569 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.491600 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.491611 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.491628 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.491639 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.500404 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.514333 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.526854 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.545539 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.560180 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.571344 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.583664 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.593328 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.593356 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.593366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.593378 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.593388 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.595734 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.607707 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.618769 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.663473 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.663599 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.663655 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:48:00.663639506 +0000 UTC m=+66.091383302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.696303 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.696337 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.696346 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.696359 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.696369 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.799352 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.799407 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.799419 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.799435 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.799449 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.901862 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.901893 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.901902 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.901917 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.901926 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:44Z","lastTransitionTime":"2025-12-16T12:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.948635 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.948797 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.949200 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.949263 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.949366 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.949436 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.949502 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.949565 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.966112 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.966264 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.966324 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:48:16.966301041 +0000 UTC m=+82.394044877 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.966355 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.966404 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:48:16.966390223 +0000 UTC m=+82.394134019 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.966457 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.966583 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:44 crc kubenswrapper[4757]: E1216 12:47:44.966663 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:48:16.966642509 +0000 UTC m=+82.394386365 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.969900 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.983604 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:44 crc kubenswrapper[4757]: I1216 12:47:44.995371 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:44Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.003931 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.003957 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.003964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.003991 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.004023 4757 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.008585 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.023974 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.040660 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.052815 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.068036 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.068100 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068281 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068305 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068318 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068390 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:48:17.068374628 +0000 UTC m=+82.496118424 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068322 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068472 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068488 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:45 crc kubenswrapper[4757]: E1216 12:47:45.068551 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:48:17.068531922 +0000 UTC m=+82.496275708 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.069201 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.083764 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.102995 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for 
network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.106496 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.106539 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.106548 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.106563 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.106574 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.115305 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.127092 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 
2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.137420 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.147095 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.155748 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.165647 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.178356 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.209461 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.209490 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.209498 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.209510 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.209520 4757 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.311941 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.311977 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.311986 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.311999 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.312022 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.413575 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.413847 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.413957 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.414057 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.414140 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.517556 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.517592 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.517603 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.517616 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.517625 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.619599 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.619859 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.619969 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.620096 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.620224 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.695796 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.706614 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.710835 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",
\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.722117 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.722154 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.722162 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.722176 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.722188 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.725902 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.737476 4757 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36
146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.749568 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.762969 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.775933 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.789594 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.806205 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.821661 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.824276 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.824412 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.824528 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.824611 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.824709 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.841809 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a
16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.855572 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.868991 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.885812 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.898597 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.910781 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.920549 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.926767 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.926805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.926816 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.926850 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.926863 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:45Z","lastTransitionTime":"2025-12-16T12:47:45Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:45 crc kubenswrapper[4757]: I1216 12:47:45.930604 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:45Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.029239 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.029303 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.029315 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.029332 4757 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.029342 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.131844 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.132085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.132174 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.132253 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.132326 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.234387 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.234428 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.234438 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.234455 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.234466 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.335937 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.335975 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.335983 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.335995 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.336022 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.438274 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.438333 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.438353 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.438380 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.438400 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.491923 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.491995 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.492061 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.492085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.492102 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.506410 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:46Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.510650 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.510784 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.510878 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.510970 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.511107 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.526338 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:46Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.531419 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.531476 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.531493 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.531513 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.531528 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.544131 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:46Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.547551 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.547659 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.547731 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.547792 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.547860 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.560255 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:46Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.564474 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.564663 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.564744 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.564836 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.564920 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.577920 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:46Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.578109 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.579641 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
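The two retries above and the final "update node status exceeds retry count" all fail for the same reason: the kubelet's node-status PATCH is intercepted by the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743, and that endpoint's serving certificate expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-16T12:47:46Z. A quick way to see exactly what the kubelet sees is to read the certificate's validity window straight off the endpoint. The sketch below is a standalone diagnostic, not part of the logged system; the address is taken from the webhook URL in the error, and the file name is illustrative.

```go
// certcheck.go: print the validity window of the TLS certificate served on
// the webhook endpoint named in the log above. Verification is disabled
// because the point is to inspect the expired certificate, not to trust it.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // diagnostic only: read the cert without verifying it
	})
	if err != nil {
		log.Fatalf("dial failed: %v", err)
	}
	defer conn.Close()

	// PeerCertificates[0] is the leaf; print every cert in the served chain.
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s\n",
			cert.Subject.String(), cert.NotBefore.UTC(), cert.NotAfter.UTC())
	}
}
```

A clock of 2025-12-16 sitting well past the notAfter of 2025-08-24 is consistent with a CRC VM resumed long after its internal certificates were issued; until those certificates rotate, every status patch will keep failing the same way.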
event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.579681 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.579693 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.579710 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.579723 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.681837 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.681876 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.681889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.681919 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.681934 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.785032 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.785086 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.785096 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.785111 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.785121 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.887298 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.887375 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.887395 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.887418 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.887436 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.948809 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.948860 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.948933 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.949439 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.949073 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.949225 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.949571 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
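Every "Error syncing pod, skipping" entry above traces back to the same root cause as the Ready=False condition: the container runtime (CRI-O on this CRC node) reports NetworkPluginNotReady because nothing has written a CNI configuration into /etc/kubernetes/cni/net.d/ yet, which normally happens only once the network provider's own pods come up. The real readiness check lives in the runtime's CNI plumbing; the sketch below only mirrors what the logged message implies, by listing recognizable CNI config files in the directory named in the log.

```go
// cnicheck.go: look for CNI network configuration files in the directory the
// kubelet message points at. An empty result matches the logged
// "no CNI configuration file in /etc/kubernetes/cni/net.d/" condition.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken verbatim from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		// CNI loaders conventionally pick up .conf, .conflist and .json files.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files found; network plugin not ready")
	}
}
```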
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:46 crc kubenswrapper[4757]: E1216 12:47:46.949746 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.990471 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.990518 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.990529 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.990546 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:46 crc kubenswrapper[4757]: I1216 12:47:46.990559 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:46Z","lastTransitionTime":"2025-12-16T12:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.092492 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.092560 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.092594 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.092610 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.092621 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.195086 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.195158 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.195181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.195209 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.195231 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.297746 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.297787 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.297795 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.297809 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.297821 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.399837 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.399876 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.399889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.399904 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.399914 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.502187 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.502217 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.502227 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.502239 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.502248 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.605788 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.605852 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.605868 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.605891 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.605907 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.709226 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.709271 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.709284 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.709299 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.709310 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.812190 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.812237 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.812248 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.812264 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.812275 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.914926 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.914987 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.915039 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.915240 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:47 crc kubenswrapper[4757]: I1216 12:47:47.915262 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:47Z","lastTransitionTime":"2025-12-16T12:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.017789 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.018015 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.018092 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.018186 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.018278 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.120641 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.120933 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.121093 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.121194 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.121275 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.223583 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.224081 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.224249 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.224386 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.224525 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.327537 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.327575 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.327586 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.327601 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.327611 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.430248 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.430556 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.430690 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.430855 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.430994 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.533621 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.533930 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.534050 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.534145 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.534247 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.637311 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.637856 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.637953 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.638054 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.638148 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.740848 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.740889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.740897 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.740911 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.740922 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.843250 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.843287 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.843297 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.843312 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.843321 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.945607 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.945857 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.945960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.946060 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.946287 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:48Z","lastTransitionTime":"2025-12-16T12:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.947928 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:48 crc kubenswrapper[4757]: E1216 12:47:48.948061 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.948195 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:48 crc kubenswrapper[4757]: E1216 12:47:48.948255 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.948391 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:48 crc kubenswrapper[4757]: E1216 12:47:48.948459 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:47:48 crc kubenswrapper[4757]: I1216 12:47:48.949887 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:47:48 crc kubenswrapper[4757]: E1216 12:47:48.949987 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... the same status block repeats at ~100 ms intervals, 12:47:49.048686 through 12:47:50.897445 ...]
Dec 16 12:47:50 crc kubenswrapper[4757]: I1216 12:47:50.948822 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:47:50 crc kubenswrapper[4757]: I1216 12:47:50.948919 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:50 crc kubenswrapper[4757]: E1216 12:47:50.949241 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:47:50 crc kubenswrapper[4757]: I1216 12:47:50.948953 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:50 crc kubenswrapper[4757]: E1216 12:47:50.949541 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:47:50 crc kubenswrapper[4757]: I1216 12:47:50.948941 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:50 crc kubenswrapper[4757]: E1216 12:47:50.949752 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:47:50 crc kubenswrapper[4757]: E1216 12:47:50.949338 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
[... the same status block repeats at ~100 ms intervals, 12:47:50.999503 through 12:47:52.851597 ...]
Dec 16 12:47:52 crc kubenswrapper[4757]: I1216 12:47:52.948856 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:52 crc kubenswrapper[4757]: I1216 12:47:52.949051 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:47:52 crc kubenswrapper[4757]: I1216 12:47:52.948953 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:52 crc kubenswrapper[4757]: E1216 12:47:52.949942 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:47:52 crc kubenswrapper[4757]: I1216 12:47:52.949961 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:52 crc kubenswrapper[4757]: E1216 12:47:52.950085 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:47:52 crc kubenswrapper[4757]: E1216 12:47:52.950252 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:47:52 crc kubenswrapper[4757]: E1216 12:47:52.950336 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
[... the same status block repeats at ~100 ms intervals, 12:47:52.953070 through 12:47:53.880286 ...]
Dec 16 12:47:53 crc kubenswrapper[4757]: I1216 12:47:53.981971 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:53 crc kubenswrapper[4757]: I1216 12:47:53.982307 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:53 crc kubenswrapper[4757]: I1216 12:47:53.982393 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:53 crc kubenswrapper[4757]: I1216 12:47:53.982475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:53 crc kubenswrapper[4757]: I1216 12:47:53.982561 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:53Z","lastTransitionTime":"2025-12-16T12:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.085955 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.086036 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.086054 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.086076 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.086092 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.189027 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.189070 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.189082 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.189098 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.189108 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.291677 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.291705 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.291713 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.291724 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.291750 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.394102 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.394143 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.394151 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.394168 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.394179 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.497405 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.497465 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.497476 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.497509 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.497522 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.599809 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.599874 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.599884 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.599898 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.599907 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.702269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.702310 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.702323 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.702341 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.702353 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.805712 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.806336 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.806463 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.806573 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.806814 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.909289 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.909329 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.909339 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.909353 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.909364 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:54Z","lastTransitionTime":"2025-12-16T12:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
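The node is flapping NotReady here for exactly the reason kubelet states: no CNI config has appeared in /etc/kubernetes/cni/net.d/ yet, and kubelet keeps the Ready condition False until the network plugin (OVN-Kubernetes on this cluster) writes one. A minimal way to confirm this from the crc node and from the API, assuming SSH access to the VM and a working oc client; these commands are illustrative and not taken from the log:

  $ ls -l /etc/kubernetes/cni/net.d/          # stays empty until ovnkube writes its CNI config
  $ oc get pods -n openshift-ovn-kubernetes -o wide
  $ oc get node crc -o jsonpath='{.status.conditions[?(@.type=="Ready")].message}'

The directory stays empty for as long as ovnkube-controller cannot start, which the status records below show is currently the case.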
Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.951230 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.951298 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.951351 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:47:54 crc kubenswrapper[4757]: E1216 12:47:54.951584 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:47:54 crc kubenswrapper[4757]: E1216 12:47:54.951705 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:47:54 crc kubenswrapper[4757]: E1216 12:47:54.951775 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.951220 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:47:54 crc kubenswrapper[4757]: E1216 12:47:54.951995 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.979746 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for 
network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:54Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:54 crc kubenswrapper[4757]: I1216 12:47:54.989863 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:54Z is after 2025-08-24T17:21:41Z" Dec 16 
12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.005689 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.011308 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.011342 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.011353 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.011367 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.011376 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.016050 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.030121 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.038403 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.046904 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.061515 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.074266 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.086874 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.099475 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.113996 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.114279 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.114297 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.114306 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.114319 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.114329 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.130341 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12
-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bd
bc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.142287 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.156475 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.168535 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.180925 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.193426 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:55Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.218216 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.218682 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.218758 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.218851 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.218931 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.321718 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.322080 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.322216 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.322364 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.322486 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.425900 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.425950 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.425962 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.425977 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.425990 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.528263 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.528595 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.528694 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.528804 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.528891 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.631596 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.631849 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.631946 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.632081 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.632183 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.734273 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.734309 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.734317 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.734329 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.734337 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.836684 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.836726 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.836734 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.836748 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.836759 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.938766 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.938800 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.938812 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.938826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:55 crc kubenswrapper[4757]: I1216 12:47:55.938836 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:55Z","lastTransitionTime":"2025-12-16T12:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.041314 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.041346 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.041355 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.041367 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.041376 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.143343 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.143373 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.143383 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.143398 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.143408 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.246292 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.246340 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.246349 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.246362 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.246371 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.348800 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.349154 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.349245 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.349315 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.349450 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.451372 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.451416 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.451427 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.451444 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.451455 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.553763 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.553815 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.553833 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.553851 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.553863 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.656882 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.656920 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.656932 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.656960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.656972 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.760110 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.760166 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.760178 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.760199 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.760213 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.862366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.862438 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.862454 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.862477 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.862490 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.926642 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.926681 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.926698 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.926713 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.926731 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.940274 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:56Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.947227 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.947303 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
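The payload the kubelet just failed to deliver is a strategic-merge patch: the $setElementOrder/conditions directive pins the order of the conditions list while each entry merges by its type key. A sketch that assembles the same kind of fragment with hand-rolled types instead of the real k8s.io/api structs:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Minimal stand-in for v1.NodeCondition; the kubelet uses k8s.io/api/core/v1.
    type NodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        ready := NodeCondition{
            Type:               "Ready",
            Status:             "False",
            LastHeartbeatTime:  "2025-12-16T12:47:56Z",
            LastTransitionTime: "2025-12-16T12:47:56Z",
            Reason:             "KubeletNotReady",
            Message:            "container runtime network not ready",
        }
        patch := map[string]any{
            "status": map[string]any{
                // Merge keys only: fixes the ordering of the existing list.
                "$setElementOrder/conditions": []map[string]string{
                    {"type": "MemoryPressure"}, {"type": "DiskPressure"},
                    {"type": "PIDPressure"}, {"type": "Ready"},
                },
                "conditions": []NodeCondition{ready},
            },
        }
        out, _ := json.Marshal(patch)
        fmt.Println(string(out))
    }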
event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.947321 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.947350 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.947365 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.948414 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.948593 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.948992 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.949217 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.948622 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.949486 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.949605 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.949862 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.954873 4757 scope.go:117] "RemoveContainer" containerID="9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff" Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.956904 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.970489 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:56Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.980227 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.980283 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
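"Error updating node status, will retry" repeats because each node-status sync makes a handful of patch attempts before giving up until the next sync period. A hedged sketch of that loop shape; the attempt budget of 5 and the stub error are assumptions for illustration:

    package main

    import (
        "errors"
        "fmt"
    )

    // tryUpdateNodeStatus stands in for the kubelet's patch attempt; here it
    // always fails the way this log does, with the webhook's TLS error.
    func tryUpdateNodeStatus(attempt int) error {
        return errors.New("failed to call webhook: tls: failed to verify certificate: " +
            "x509: certificate has expired or is not yet valid")
    }

    func main() {
        const nodeStatusUpdateRetry = 5 // assumed attempt budget per sync
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(i); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return
        }
        fmt.Println("unable to update node status after", nodeStatusUpdateRetry, "attempts")
    }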
event="NodeHasNoDiskPressure" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.980295 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.980312 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:56 crc kubenswrapper[4757]: I1216 12:47:56.980323 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:56Z","lastTransitionTime":"2025-12-16T12:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:56 crc kubenswrapper[4757]: E1216 12:47:56.995992 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:56Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.000480 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.000516 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.000526 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.000539 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.000549 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: E1216 12:47:57.016278 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:57Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.021031 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.021077 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
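[Annotation: every one of these patch failures has the same root cause: the network-node-identity webhook at 127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z. A minimal sketch for confirming that from the node, assuming Python 3 with the third-party `cryptography` package; the host, port, and webhook name are taken from the Post URL in the error above, everything else is illustrative:]

```python
# Minimal sketch (not part of the log) to confirm the webhook's serving
# certificate is expired. Host/port come from the Post URL above; the
# third-party `cryptography` package is an assumed dependency.
import datetime
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # node.network-node-identity.openshift.io endpoint

# get_server_certificate() skips chain validation, so it still returns the
# leaf certificate even though that certificate is already expired.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.now(datetime.timezone.utc)
print("notBefore:", cert.not_valid_before_utc)  # cryptography >= 42
print("notAfter: ", cert.not_valid_after_utc)
print("expired:  ", now > cert.not_valid_after_utc)
```

[If notAfter is in the past, as the log's "current time ... is after ..." message indicates, node-status patches cannot succeed until the webhook's serving certificate is rotated.]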
event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.021090 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.021110 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.021124 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: E1216 12:47:57.036418 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:57Z is after 2025-08-24T17:21:41Z" Dec 16 12:47:57 crc kubenswrapper[4757]: E1216 12:47:57.036590 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.038165 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
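[Annotation: "update node status exceeds retry count" is where the kubelet gives up on this round of status reporting: it attempts the PATCH a bounded number of times and every attempt dies on the same webhook call. A rough Python sketch of that give-up pattern; the constant name and value follow the upstream kubelet sources (nodeStatusUpdateRetry = 5) and are assumptions, not facts stated in this log:]

```python
# Illustrative sketch of the retry pattern behind "Error updating node
# status, will retry" / "update node status exceeds retry count".
NODE_STATUS_UPDATE_RETRY = 5  # assumed from kubelet sources, not this log

def update_node_status(try_update):
    """try_update() stands in for one PATCH attempt; it raises on failure
    (here, every attempt fails on the expired webhook certificate)."""
    for _ in range(NODE_STATUS_UPDATE_RETRY):
        try:
            try_update()
            return
        except Exception as err:
            print(f'"Error updating node status, will retry" err={err}')
    raise RuntimeError("update node status exceeds retry count")

# Example: every attempt fails the way the log shows.
def failing_patch():
    raise ConnectionError("tls: failed to verify certificate: x509: "
                          "certificate has expired or is not yet valid")

try:
    update_node_status(failing_patch)
except RuntimeError as err:
    print(f'"Unable to update node status" err={err}')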
event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.038196 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.038205 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.038218 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.038228 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.141190 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.141256 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.141268 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.141284 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.141296 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.243661 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.243706 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.243719 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.243738 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.243749 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.346156 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.346185 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.346193 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.346205 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.346213 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.448403 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.448438 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.448448 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.448462 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.448474 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.550606 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.550651 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.550660 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.550675 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.550684 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.653422 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.653467 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.653478 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.653495 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.653508 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.755672 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.755710 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.755720 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.755733 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.755743 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.858697 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.858756 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.858770 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.858790 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.858802 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.960559 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.960636 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.960653 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.960677 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:57 crc kubenswrapper[4757]: I1216 12:47:57.960694 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:57Z","lastTransitionTime":"2025-12-16T12:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.063256 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.063288 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.063297 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.063315 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.063327 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.165826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.165864 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.165875 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.165891 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.165902 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.268862 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.268918 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.268928 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.268942 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.268951 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.372414 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.372479 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.372497 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.372521 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.372532 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.474554 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.474588 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.474597 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.474611 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.474620 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.847963 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.848020 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.848030 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.848044 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.848054 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.947904 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.947970 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:47:58 crc kubenswrapper[4757]: E1216 12:47:58.948047 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.948062 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:47:58 crc kubenswrapper[4757]: E1216 12:47:58.948119 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.947972 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:47:58 crc kubenswrapper[4757]: E1216 12:47:58.948198 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:47:58 crc kubenswrapper[4757]: E1216 12:47:58.948283 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.951180 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.951235 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.951247 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.951262 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:58 crc kubenswrapper[4757]: I1216 12:47:58.951273 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:58Z","lastTransitionTime":"2025-12-16T12:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.053893 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.054416 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.054481 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.054548 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.054602 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.157220 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.157258 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.157267 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.157281 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.157291 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.259956 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.260042 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.260052 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.260067 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.260080 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.362377 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.362415 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.362426 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.362441 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.362454 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.464230 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.464493 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.464565 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.464641 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.464722 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.567583 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.567876 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.567964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.568070 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.568165 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.670381 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.670427 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.670438 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.670456 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.670469 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.773166 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.773516 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.773612 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.773712 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.773786 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.876643 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.876918 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.876998 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.877100 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.877165 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.980177 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.980557 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.980656 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.980752 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:47:59 crc kubenswrapper[4757]: I1216 12:47:59.980856 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:47:59Z","lastTransitionTime":"2025-12-16T12:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.083507 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.083555 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.083566 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.083585 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.083593 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.186287 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.186323 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.186333 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.186350 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.186361 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.288648 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.288686 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.288698 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.288710 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.288720 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.391039 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.391258 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.391344 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.391426 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.391509 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.493832 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.493907 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.493916 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.493930 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.493939 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.596253 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.596412 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.596427 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.596448 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.596467 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.699241 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.699286 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.699300 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.699317 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.699330 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.750094 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:48:00 crc kubenswrapper[4757]: E1216 12:48:00.750283 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 12:48:00 crc kubenswrapper[4757]: E1216 12:48:00.750361 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:48:32.750343355 +0000 UTC m=+98.178087151 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.802073 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.802139 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.802148 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.802162 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.802171 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.904853 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.904894 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.904904 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.904943 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.904953 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:00Z","lastTransitionTime":"2025-12-16T12:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.948553 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.948667 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.948561 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:48:00 crc kubenswrapper[4757]: E1216 12:48:00.948726 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:48:00 crc kubenswrapper[4757]: E1216 12:48:00.948667 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:48:00 crc kubenswrapper[4757]: E1216 12:48:00.948781 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:48:00 crc kubenswrapper[4757]: I1216 12:48:00.948570 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:48:00 crc kubenswrapper[4757]: E1216 12:48:00.949174 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.007353 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.007610 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.007674 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.007740 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.007805 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.110155 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.110678 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.110785 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.110862 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.110923 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.213168 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.213398 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.213406 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.213423 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.213434 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.315727 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.315776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.315784 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.315801 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.315810 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.417552 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.417588 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.417597 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.417610 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.417620 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.520452 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.520483 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.520494 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.520514 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.520524 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.622973 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.623106 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.623134 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.623160 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.623181 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.725587 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.725622 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.725632 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.725644 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.725653 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.828602 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.828648 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.828670 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.828686 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.828698 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.931192 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.931247 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.931260 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.931278 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:01 crc kubenswrapper[4757]: I1216 12:48:01.931291 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:01Z","lastTransitionTime":"2025-12-16T12:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.033733 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.033785 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.033793 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.033807 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.033816 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.135753 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.135808 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.135836 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.135852 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.135862 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.237956 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.237993 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.238030 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.238047 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.238061 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.340765 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.340808 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.340819 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.340836 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.340848 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.416704 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/0.log"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.416755 4757 generic.go:334] "Generic (PLEG): container finished" podID="395610a4-58ca-497e-93a6-714bd6c111c1" containerID="6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866" exitCode=1
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.416786 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerDied","Data":"6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.417207 4757 scope.go:117] "RemoveContainer" containerID="6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.435404 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.444919 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.445391 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.445407 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.445423 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.445433 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.463846 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z"
Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.475331 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.499654 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.514803 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.526075 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.535849 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.547725 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.547751 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.547759 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.547771 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.547779 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.548288 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.561461 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.577882 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.605259 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.620056 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"2025-12-16T12:47:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f\\\\n2025-12-16T12:47:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f to /host/opt/cni/bin/\\\\n2025-12-16T12:47:17Z [verbose] multus-daemon started\\\\n2025-12-16T12:47:17Z [verbose] Readiness Indicator file check\\\\n2025-12-16T12:48:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.635560 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0482
8289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.648479 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.649768 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.649801 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.649811 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.649826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.649838 4757 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.659106 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.670281 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.682076 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.696896 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:02Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.751737 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.751768 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.751776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.751789 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.751798 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.853480 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.853525 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.853539 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.853560 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.853576 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.950792 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:02 crc kubenswrapper[4757]: E1216 12:48:02.950907 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.951115 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:02 crc kubenswrapper[4757]: E1216 12:48:02.951177 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.951315 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:02 crc kubenswrapper[4757]: E1216 12:48:02.951425 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.951742 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:02 crc kubenswrapper[4757]: E1216 12:48:02.951806 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.955596 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.955625 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.955635 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.955649 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:02 crc kubenswrapper[4757]: I1216 12:48:02.955659 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:02Z","lastTransitionTime":"2025-12-16T12:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.058402 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.058430 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.058438 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.058451 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.058459 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.160232 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.160255 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.160263 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.160324 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.160334 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.262900 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.262939 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.262947 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.262961 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.262972 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.365023 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.365059 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.365067 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.365079 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.365088 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.422316 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/0.log" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.422374 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerStarted","Data":"e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.435423 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.445731 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.456092 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.466863 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.466896 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.466905 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.466917 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.466925 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.471641 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.484751 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.494822 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.505350 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"2025-12-16T12:47:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f\\\\n2025-12-16T12:47:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f to /host/opt/cni/bin/\\\\n2025-12-16T12:47:17Z [verbose] multus-daemon started\\\\n2025-12-16T12:47:17Z [verbose] Readiness Indicator file check\\\\n2025-12-16T12:48:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:48:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.519302 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.531195 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.542144 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.553860 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.564263 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.572554 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.572583 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.572592 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.572608 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.572617 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.575575 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.587238 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
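Every "Failed to update status for pod" record in this stretch fails with the same webhook TLS error, and the interleaved "Node became not ready" events are a downstream symptom of the same root cause: ovnkube-controller crash-loops on the expired certificate (see the F1216 fatal in the ovnkube-node-t465t record below), so it never writes 10-ovn-kubernetes.conf, multus's readiness wait times out, and the kubelet finds no CNI configuration in /etc/kubernetes/cni/net.d/. Judging by the "pollimmediate error: timed out waiting for the condition" text, the wait is a PollImmediate-style loop; a stdlib-only stand-in is sketched below (this is not multus's actual code, and the 46-second budget is an inference from the 12:47:16 start and 12:48:02 failure timestamps):

    package main

    import (
        "fmt"
        "os"
        "time"
    )

    // pollForFile approximates the readiness-indicator wait seen in the
    // multus container log above. The real daemon uses a PollImmediate-style
    // helper; this plain loop is an illustrative stand-in.
    func pollForFile(path string, interval, timeout time.Duration) error {
        deadline := time.Now().Add(timeout)
        for {
            if _, err := os.Stat(path); err == nil {
                return nil // indicator file exists: default network is ready
            }
            if time.Now().After(deadline) {
                return fmt.Errorf("timed out waiting for the condition")
            }
            time.Sleep(interval)
        }
    }

    func main() {
        // Path taken from the log; the timeout is an assumption (see above).
        err := pollForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
            time.Second, 46*time.Second)
        if err != nil {
            fmt.Println("[error] have you checked that your default network is ready?", err)
        }
    }
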
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.607927 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a
16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.619354 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.638312 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.650863 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:03Z is after 2025-08-24T17:21:41Z"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.675087 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.675118 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.675130 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.675145 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.675156 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.777144 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.777182 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.777194 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.777209 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.777221 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.879737 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.879776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.879789 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.879803 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.879813 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.981809 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.981834 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.981843 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.981856 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:03 crc kubenswrapper[4757]: I1216 12:48:03.981866 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:03Z","lastTransitionTime":"2025-12-16T12:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.083743 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.083789 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.083801 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.083819 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.083831 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.186088 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.186136 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.186145 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.186158 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.186183 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.289580 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.289649 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.289667 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.289700 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.289717 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.391852 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.392110 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.392192 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.392267 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.392337 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.494383 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.494413 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.494423 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.494436 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.494445 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.596211 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.596240 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.596248 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.596261 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.596270 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.698276 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.698319 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.698331 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.698348 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.698361 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.801091 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.801126 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.801136 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.801151 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.801163 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.904458 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.904496 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.904507 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.904521 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.904532 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:04Z","lastTransitionTime":"2025-12-16T12:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.948258 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:48:04 crc kubenswrapper[4757]: E1216 12:48:04.948700 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.948344 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.948374 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.948294 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:48:04 crc kubenswrapper[4757]: E1216 12:48:04.949512 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:48:04 crc kubenswrapper[4757]: E1216 12:48:04.949647 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:48:04 crc kubenswrapper[4757]: E1216 12:48:04.949753 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.960891 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:04Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.973433 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:04Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.984946 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:04Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:04 crc kubenswrapper[4757]: I1216 12:48:04.997142 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:04Z is after 2025-08-24T17:21:41Z"
Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.007726 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.008147 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.008406 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.008669 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.008904 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:05Z","lastTransitionTime":"2025-12-16T12:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.012427 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.027709 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.059268 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a
16a0311ea0cfd9e28d61c8ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.070498 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.089933 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.101922 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.111743 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.111808 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.111821 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 
12:48:05.111839 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.111851 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:05Z","lastTransitionTime":"2025-12-16T12:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.115991 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.126300 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.137311 4757 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.152483 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.168077 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.179663 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.191812 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"2025-12-16T12:47:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f\\\\n2025-12-16T12:47:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f to /host/opt/cni/bin/\\\\n2025-12-16T12:47:17Z [verbose] multus-daemon started\\\\n2025-12-16T12:47:17Z [verbose] Readiness Indicator file check\\\\n2025-12-16T12:48:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:48:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.210965 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:05Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.218259 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.218309 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:05 crc 
kubenswrapper[4757]: I1216 12:48:05.218319 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.218345 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:05 crc kubenswrapper[4757]: I1216 12:48:05.218356 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:05Z","lastTransitionTime":"2025-12-16T12:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.948654 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.948725 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.948863 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.948927 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:06 crc kubenswrapper[4757]: E1216 12:48:06.948995 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:06 crc kubenswrapper[4757]: E1216 12:48:06.949171 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:06 crc kubenswrapper[4757]: E1216 12:48:06.949225 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:06 crc kubenswrapper[4757]: E1216 12:48:06.949302 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.959903 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.961448 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.961479 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.961488 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.961501 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:06 crc kubenswrapper[4757]: I1216 12:48:06.961509 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:06Z","lastTransitionTime":"2025-12-16T12:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.064097 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.064144 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.064156 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.064182 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.064195 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.166584 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.166632 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.166646 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.166669 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.166713 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.270514 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.270564 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.270579 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.270595 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.270606 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.281607 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.281649 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.281660 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.281674 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.281686 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: E1216 12:48:07.299245 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:07Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.304042 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.304085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.304094 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.304109 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.304118 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: E1216 12:48:07.315854 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:07Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.319553 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.319609 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.319619 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.319634 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.319660 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: E1216 12:48:07.333305 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:07Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.336375 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.336401 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.336411 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.336425 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.336435 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: E1216 12:48:07.351888 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:07Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.355207 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.355253 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
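
The repeated x509 failure above reduces to a clock-versus-notAfter comparison inside the TLS handshake: the webhook's serving certificate expired on 2025-08-24, months before this boot. A minimal Python sketch of that check, using the exact timestamps kubelet logged (the script is illustrative, not kubelet's actual code):

    from datetime import datetime, timezone

    # Values copied from the webhook error above.
    not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)   # certificate notAfter
    now = datetime(2025, 12, 16, 12, 48, 7, tzinfo=timezone.utc)         # "current time" in the error

    if now > not_after:
        # Mirrors the Go TLS error string seen in the log.
        print(f"x509: certificate has expired or is not yet valid: "
              f"current time {now:%Y-%m-%dT%H:%M:%SZ} is after {not_after:%Y-%m-%dT%H:%M:%SZ}")

Until the certificate behind https://127.0.0.1:9743 is rotated, every node-status patch will fail identically, which is why the retries below repeat verbatim.
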
event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.355262 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.355278 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.355288 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: E1216 12:48:07.366686 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:07Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:07 crc kubenswrapper[4757]: E1216 12:48:07.366817 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.375054 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.375100 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.375110 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.375125 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.375135 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.477287 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.477345 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.477355 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.477372 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.477382 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.580286 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.580350 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.580359 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.580372 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.580383 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.683427 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.683471 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.683481 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.683498 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.683508 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.785831 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.785886 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.785903 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.785925 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.785941 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.888459 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.888493 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.888504 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.888519 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.888530 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.949737 4757 scope.go:117] "RemoveContainer" containerID="9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.992181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.992219 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.992234 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.992251 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:07 crc kubenswrapper[4757]: I1216 12:48:07.992287 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:07Z","lastTransitionTime":"2025-12-16T12:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.095057 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.095092 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.095103 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.095118 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.095131 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.196977 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.197033 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.197043 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.197057 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.197068 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.299102 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.299131 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.299139 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.299152 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.299161 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.401190 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.401218 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.401226 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.401238 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.401246 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.505780 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.505817 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.505826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.505840 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.505850 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.608326 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.608366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.608374 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.608386 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.608396 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.710426 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.710461 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.710471 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.710485 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.710497 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.813204 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.813233 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.813241 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.813254 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.813264 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.915365 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.915397 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.915405 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.915418 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.915427 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:08Z","lastTransitionTime":"2025-12-16T12:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.949177 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:08 crc kubenswrapper[4757]: E1216 12:48:08.949311 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.949487 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:08 crc kubenswrapper[4757]: E1216 12:48:08.949543 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.949644 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:08 crc kubenswrapper[4757]: E1216 12:48:08.949691 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:08 crc kubenswrapper[4757]: I1216 12:48:08.949807 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:08 crc kubenswrapper[4757]: E1216 12:48:08.949877 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.021039 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.021076 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.021087 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.021107 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.021117 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.123574 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.123607 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.123616 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.123632 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.123643 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.225927 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.225979 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.225988 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.226016 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.226026 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.328417 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.328451 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.328460 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.328474 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.328486 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.430819 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.430846 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.430856 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.430869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.430877 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.446042 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/2.log" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.448480 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.462448 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.474855 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 
2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.493195 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a
d2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for 
network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[
{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.506083 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 
12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.520682 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97955ddc-f61f-446f-94b5-22b848319b87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f662a9a0f32f67d2c0c9018c4324bd101f82c13f1fc031a545ca559f4b1df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.533394 4757 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.533437 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.533448 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.533462 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.533470 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.545871 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.557786 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.569377 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.579514 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.591480 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.604403 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.618241 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.632389 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"2025-12-16T12:47:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f\\\\n2025-12-16T12:47:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f to /host/opt/cni/bin/\\\\n2025-12-16T12:47:17Z [verbose] multus-daemon started\\\\n2025-12-16T12:47:17Z [verbose] Readiness Indicator file check\\\\n2025-12-16T12:48:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:48:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.636224 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.636254 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.636262 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.636276 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.636285 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.647483 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.660122 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.672721 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.683556 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.706782 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.719524 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:09Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.738085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.738140 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.738149 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.738163 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.738171 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.840357 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.840390 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.840400 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.840412 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.840421 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.944386 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.944441 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.944456 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.944475 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:09 crc kubenswrapper[4757]: I1216 12:48:09.944491 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:09Z","lastTransitionTime":"2025-12-16T12:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.046872 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.046909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.046921 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.046939 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.046952 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.149279 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.149320 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.149331 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.149349 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.149360 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.251619 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.251655 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.251664 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.251677 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.251690 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.354365 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.354424 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.354441 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.354466 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.354482 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.454493 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/3.log" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.455442 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/2.log" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.456717 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.456747 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.456763 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.456783 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.456794 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.459358 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" exitCode=1 Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.459393 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.459420 4757 scope.go:117] "RemoveContainer" containerID="9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.461056 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:48:10 crc kubenswrapper[4757]: E1216 12:48:10.461270 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.485199 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.496811 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.511119 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.525406 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.538100 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.552658 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.560717 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.560753 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.560762 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.560778 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.560790 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.570458 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"2025-12-16T12:47:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f\\\\n2025-12-16T12:47:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f to /host/opt/cni/bin/\\\\n2025-12-16T12:47:17Z [verbose] multus-daemon started\\\\n2025-12-16T12:47:17Z [verbose] Readiness Indicator file check\\\\n2025-12-16T12:48:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:48:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.590251 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.603464 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.614450 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.627518 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.639402 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.650274 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.661588 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97955ddc-f61f-446f-94b5-22b848319b87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f662a9a0f32f67d2c0c9018c4324bd101f82c13f1fc031a545ca559f4b1df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.663050 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.663117 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.663129 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.663148 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.663159 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.687115 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.704237 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.723800 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 
2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.752188 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a
d2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for 
network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:09Z\\\",\\\"message\\\":\\\"cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:48:09.548273 6752 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:48:09.548349 6752 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:48:09.546767 6752 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1216 12:48:09.547790 6752 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF1216 12:48:09.548512 6752 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.763242 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:10Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.765752 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.765775 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.765785 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.765797 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.765806 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.869486 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.869729 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.869843 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.869929 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.870026 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.948474 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:10 crc kubenswrapper[4757]: E1216 12:48:10.948851 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.949186 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.949306 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.949262 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:10 crc kubenswrapper[4757]: E1216 12:48:10.949455 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:10 crc kubenswrapper[4757]: E1216 12:48:10.949551 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:10 crc kubenswrapper[4757]: E1216 12:48:10.949752 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.972525 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.972750 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.972848 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.972922 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:10 crc kubenswrapper[4757]: I1216 12:48:10.973042 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:10Z","lastTransitionTime":"2025-12-16T12:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.076018 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.076332 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.076431 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.076524 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.076634 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.179429 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.179465 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.179474 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.179487 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.179496 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.281671 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.281941 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.282048 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.282175 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.282257 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.384541 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.384592 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.384601 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.384612 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.384630 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.465043 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/3.log" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.487820 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.487860 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.487872 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.487889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.487901 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.590873 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.591176 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.591273 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.591350 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.591431 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.693838 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.694192 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.694269 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.694328 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.694380 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.797871 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.797941 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.797964 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.798086 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.798115 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.900294 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.900587 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.900673 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.900755 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:11 crc kubenswrapper[4757]: I1216 12:48:11.900843 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:11Z","lastTransitionTime":"2025-12-16T12:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.003815 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.003857 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.003866 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.003880 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.003888 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:12Z","lastTransitionTime":"2025-12-16T12:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.928933 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.929239 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.929335 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.929421 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.929511 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:12Z","lastTransitionTime":"2025-12-16T12:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.948642 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.948690 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.948753 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:48:12 crc kubenswrapper[4757]: E1216 12:48:12.948788 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:48:12 crc kubenswrapper[4757]: I1216 12:48:12.948726 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:48:12 crc kubenswrapper[4757]: E1216 12:48:12.948957 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:48:12 crc kubenswrapper[4757]: E1216 12:48:12.949052 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
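The kubelet stays NotReady here because its network plugin reports no CNI configuration. The condition it keeps failing is simple: the conf directory named in the message contains no network config. A minimal Go sketch of an equivalent check (a hypothetical node-side diagnostic, not the kubelet's own code):

// cnicheck.go - minimal sketch, assuming only that CNI configs live in the
// directory named in the log message; reproduces the condition behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory from the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Fprintf(os.Stderr, "cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions CNI accepts for network configs
			fmt.Printf("found CNI config: %s\n", filepath.Join(confDir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file found; the network plugin would report NetworkPluginNotReady")
	}
}

An empty result here, while the network operator pods above are still waiting for sandboxes, is consistent with the NodeNotReady loop in this capture.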
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:12 crc kubenswrapper[4757]: E1216 12:48:12.949147 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.032019 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.032317 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.032410 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.032652 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.032744 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:13Z","lastTransitionTime":"2025-12-16T12:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.134776 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.134816 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.134826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.134843 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:13 crc kubenswrapper[4757]: I1216 12:48:13.134854 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:13Z","lastTransitionTime":"2025-12-16T12:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.880360 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.880406 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.880420 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.880442 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.880456 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:14Z","lastTransitionTime":"2025-12-16T12:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.948477 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:48:14 crc kubenswrapper[4757]: E1216 12:48:14.948862 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.948521 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
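The same four pods fail to sync every two seconds for as long as the CNI config is missing. To collapse that repetition into a per-pod tally, a small filter like the hypothetical one below can be fed the raw journal (e.g. journalctl -u kubelet --no-pager | go run syncerrs.go):

// syncerrs.go - sketch of a log-triage pass: count "Error syncing pod,
// skipping" occurrences per pod in a journal dump read from stdin, so the
// noisy repeats above collapse into a summary.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	podRe := regexp.MustCompile(`"Error syncing pod, skipping".*?pod="([^"]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := podRe.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%6d  %s\n", n, pod)
	}
}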
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:14 crc kubenswrapper[4757]: E1216 12:48:14.949149 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.948477 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:14 crc kubenswrapper[4757]: E1216 12:48:14.949395 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.948630 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:14 crc kubenswrapper[4757]: E1216 12:48:14.949603 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.965061 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"149ec790-f813-4055-8986-3674f9b10732\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T12:47:12Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1216 12:47:07.352370 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 12:47:07.361153 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2047606038/tls.crt::/tmp/serving-cert-2047606038/tls.key\\\\\\\"\\\\nI1216 12:47:12.564712 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 12:47:12.569869 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 12:47:12.569898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 12:47:12.569932 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 12:47:12.569942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 12:47:12.584914 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1216 12:47:12.584934 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1216 12:47:12.584972 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584984 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 12:47:12.584992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 12:47:12.585019 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 12:47:12.585025 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 12:47:12.585028 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1216 12:47:12.587440 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:14Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.983574 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.983629 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.983639 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.983653 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.983663 4757 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:14Z","lastTransitionTime":"2025-12-16T12:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.985434 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://162159aeddc0444a52b49e35a97b7ddb727e6b86186912f4a9b1b2b19b768893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f81f8f706a9d1ded3c703f5d7545a7b508e4b51223d31714014b4eb46742b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:14Z is after 2025-08-24T17:21:41Z"
Dec 16 12:48:14 crc kubenswrapper[4757]: I1216 12:48:14.997297 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63b7eab448b2a60ba4aec13b130a85228e05eee4889c76e964ddb729dcd4e679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:14Z is after 2025-08-24T17:21:41Z"
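The patch the status manager tried to send is embedded in each of these entries as an escaped JSON string between failed to patch status \" and the closing \". Assuming that fragment has already been cut out of the line, one level of Go-style unquoting plus json.Indent makes it readable. A sketch with a trimmed stand-in payload (the uid is the one from the network-node-identity entry above):

// patchdump.go - sketch: undo one level of string escaping on the captured
// patch fragment, then pretty-print the JSON the status manager tried to send.
// Assumes exactly one remaining level of \" escaping in the captured text.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// Trimmed stand-in for the escaped payload seen in the log.
	escaped := `{\"metadata\":{\"uid\":\"ef543e1b-8068-4ea3-b32a-61027b32e95d\"},\"status\":{\"conditions\":[{\"type\":\"Ready\",\"status\":\"True\"}]}}`
	// Re-wrap in quotes so strconv.Unquote can reverse the escaping.
	unquoted, err := strconv.Unquote(`"` + escaped + `"`)
	if err != nil {
		fmt.Println("unquote:", err)
		return
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(unquoted), "", "  "); err != nil {
		fmt.Println("indent:", err)
		return
	}
	fmt.Println(pretty.String())
}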
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43be7319-eac3-4e51-9560-e12d51e97ca6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd68be12dfe94c20431c867c31389edf0353e1a6b174188c2047cec57243236e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sx9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tm6vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.022073 4757 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-xhz4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"494d6b1a-0610-4a79-be5d-3c7e54f5c2eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d473d68552fcbc1fbf5760e1173fc2ff900bcda437f962622dee7809f5689bf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tb266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xhz4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.035630 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6wv7w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e5d7a25-bcdb-4347-b67b-008b3a0c48f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71af6bf87dc0c8177aefb8cc61372a62c1eb5572ddb931a465a8f67b2d58fc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-smpc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6wv7w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.054518 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cz9q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"395610a4-58ca-497e-93a6-714bd6c111c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:02Z\\\",\\\"message\\\":\\\"2025-12-16T12:47:16+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f\\\\n2025-12-16T12:47:16+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7dd87b8e-1b5c-4a1b-8bb5-cf897596e92f to /host/opt/cni/bin/\\\\n2025-12-16T12:47:17Z [verbose] multus-daemon started\\\\n2025-12-16T12:47:17Z [verbose] Readiness Indicator file check\\\\n2025-12-16T12:48:02Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:48:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtgw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cz9q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.070281 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e68497b0-de41-4a06-a7ca-2944fded6bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccba2203b81be236b7c555c281ed3e731b2e12fbe204dbfe836065ab897821d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fb69d1df644503ecb833073b740c7e8b0c526877ba3bf414ea3f5e375fd0bb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1279a94596e3109769046882db1d4606851caa0901629f9093e478aab6ed00e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fcc830d4acd0919147f15aadb2d5dfd7e47cc34c0fb26a1337127ccde1e6fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04828289ca9269b5e32c6ac6b349b57ae095d9f2c50eb0c568489b08cec6431d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1296b34ffa0e869ee9815a27103b9e5854a5c28ca44701cc65ae47be12d535e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1c96646566912bb59286550bf0a645295481a94002207fa02a8603a0a4bbd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r8xqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8lq2b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.083997 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60de9dea-dbdc-44c1-94ce-b66bf37ff6ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f07c09629a4e0a8432559ffd8a7183c9f1e2601f02809812c054db5b2a0ce18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6d1fbd5ae19e9957672842c048d66d3cc7a338f07cd121e0339d04f3a385b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93bff66a8a87e119b03fb19c4031e1d78c9094f0d250571f36146c7f5a6a9e6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.086923 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.086958 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.086969 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.086984 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.087000 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.095333 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.109603 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.123441 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.134779 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-k6rww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-k6rww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.150036 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97955ddc-f61f-446f-94b5-22b848319b87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f662a9a0f32f67d2c0c9018c4324bd101f82c13f1fc031a545ca559f4b1df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43b37e293c8b418fa9c430f42d725bb63b8a9f22a3e42aad4939916896e3fbcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.174176 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18ad073-0315-4bc7-a05a-4c8c6ef60e94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3e732865af695ba159c0b1380932467a1665a399f79ab5db806c998b0510146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://702acfad518391fd7d20c5f0079422cb3f4f251eeb297a485011920728848954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1231232d29ee6c2b8989f8e9950f9e8b4772f52f11318f4c2c66149b3bea790c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1659eb810be892eef6472518c4f5a1fab7d7f54
929053902cf2452ecdc80d480\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51bb9a52a8ccce154c4bef200e36babccd30e6df4cf6ede8e4e6789f27080dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dcc0833c3c543ae5de39e67d67dc2bca86f3b60dd7da28b1af422789ce05257\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://71bb45b0a83811c624c686372a0eb8d49a1d182d60035b0aab033164e21b39c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92f2dc7039356c9061dea6ca34eb4bce77281524fa963489a82f48a0fcdcc23\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.187178 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7838f83-3690-4e67-9cee-00e7bd61a04e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928dcf59424909a1463c643874dd93e265bce2029cbf595ed81ad3a8fad2c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://220e8c9a89d13c302c004e356c18732f517190cc6651a0116e935d3304a0b566\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47d0598704fd1564a6a744ba766dc51c45a02ba411a6a8b151cc615e52792632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4884b6fbb86d4ba984e0066dc8bd18f65626bce7244d43655d45980d4246a9c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:46:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:46:55Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.189797 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.189819 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.189830 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 
12:48:15.189844 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.189866 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.199157 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0980344e51be980a85cd89ed97609011d76887700affe4beda3ac2689828c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.218241 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b876e35b-75f8-407e-bf25-f7b3c2024428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:15Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d48b8ad50a701b7e6374ead85d2855cb6a9c64a16a0311ea0cfd9e28d61c8ff\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:47:43Z\\\",\\\"message\\\":\\\"services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.119\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1216 12:47:42.718187 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:47:42Z is after 2025-08-24T17:21:41Z]\\\\nI1216 12:47:42.718168 6376 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T12:48:09Z\\\",\\\"message\\\":\\\"cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1216 12:48:09.548273 6752 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:48:09.548349 6752 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1216 12:48:09.546767 6752 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1216 12:48:09.547790 6752 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF1216 12:48:09.548512 6752 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T12:48:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\
\\":\\\"cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T12:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T12:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c58k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t465t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.230085 4757 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8c859be-7650-49fa-a810-1bd096153c33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T12:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b41e73e0e8e60e28a121a58c2c062fd5a9c0511b830575351f44047d1fe49b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4319cc559caee7402a63a08e2e1136722b2dbec25d2388f64f7f1d55e46dde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T12:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-85fnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T12:47:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8t5c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:15Z is after 2025-08-24T17:21:41Z" Dec 16 
12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.292676 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.292711 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.292722 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.292737 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.292748 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.394765 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.394805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.394815 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.394831 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.394846 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.497153 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.497213 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.497224 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.497236 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.497246 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.599112 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.599161 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.599171 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.599186 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.599197 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.701085 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.701144 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.701156 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.701173 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.701184 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.805245 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.805292 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.805303 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.805318 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.805329 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.908802 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.908886 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.908908 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.908937 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:15 crc kubenswrapper[4757]: I1216 12:48:15.908960 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:15Z","lastTransitionTime":"2025-12-16T12:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.012112 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.012162 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.012173 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.012191 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.012204 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.114391 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.114442 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.114453 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.114468 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.114479 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.216514 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.216552 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.216564 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.216580 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.216593 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.318694 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.318744 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.318753 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.318768 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.318778 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.420582 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.420621 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.420630 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.420659 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.420669 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.522886 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.522954 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.522965 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.522981 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.522991 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.625912 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.625961 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.625974 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.625990 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.626000 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.728640 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.728724 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.728737 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.728754 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.728765 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.831352 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.831403 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.831413 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.831426 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.831435 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.933836 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.933869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.933879 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.933894 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.933904 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:16Z","lastTransitionTime":"2025-12-16T12:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.949304 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.949403 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.949505 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:16 crc kubenswrapper[4757]: E1216 12:48:16.949617 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:16 crc kubenswrapper[4757]: I1216 12:48:16.949824 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:16 crc kubenswrapper[4757]: E1216 12:48:16.949934 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:16 crc kubenswrapper[4757]: E1216 12:48:16.950199 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:16 crc kubenswrapper[4757]: E1216 12:48:16.950866 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.018861 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.019155 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.019241 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:21.019201157 +0000 UTC m=+146.446944993 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.019328 4757 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.019419 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:49:21.019396034 +0000 UTC m=+146.447139830 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.019436 4757 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.019508 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:49:21.019497427 +0000 UTC m=+146.447241403 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.019330 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.036696 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.036732 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.036742 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.036756 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.036766 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.120264 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.120311 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120434 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120471 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120479 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120485 4757 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120499 4757 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120512 4757 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120545 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 12:49:21.120530924 +0000 UTC m=+146.548274720 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.120562 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 12:49:21.120555394 +0000 UTC m=+146.548299190 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.139124 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.139170 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.139181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.139199 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.139210 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.241877 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.241903 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.241911 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.241922 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.241931 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.344807 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.344849 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.344857 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.344871 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.344880 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.446809 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.446845 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.446864 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.446879 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.446890 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.549632 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.549674 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.549691 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.549709 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.549721 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.652367 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.652404 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.652415 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.652429 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.652441 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.693369 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.693420 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.693441 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.693458 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.693470 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.713791 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T12:48:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a609af24-e04e-486a-9383-84e6961dbf65\\\",\\\"systemUUID\\\":\\\"56973bd2-6cf5-45e5-a4b6-0f6a651ea1df\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.717626 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.717660 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.717669 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.717681 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.717690 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.729242 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.732997 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.733055 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.733065 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.733079 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.733088 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.745737 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.748850 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.748888 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.748897 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.748909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.748918 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.761346 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.764620 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.764662 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.764674 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.764689 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.764702 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.775624 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T12:48:17Z is after 2025-08-24T17:21:41Z" Dec 16 12:48:17 crc kubenswrapper[4757]: E1216 12:48:17.775799 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.777440 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.777490 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.777501 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.777521 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.777533 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.879025 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.879057 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.879070 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.879083 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.879092 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.981366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.981406 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.981416 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.981432 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:17 crc kubenswrapper[4757]: I1216 12:48:17.981445 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:17Z","lastTransitionTime":"2025-12-16T12:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.084056 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.084087 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.084095 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.084108 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.084116 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:18Z","lastTransitionTime":"2025-12-16T12:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.948915 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.949101 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:18 crc kubenswrapper[4757]: E1216 12:48:18.949164 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.949178 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:18 crc kubenswrapper[4757]: I1216 12:48:18.950600 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:18 crc kubenswrapper[4757]: E1216 12:48:18.951058 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:18 crc kubenswrapper[4757]: E1216 12:48:18.952354 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:18 crc kubenswrapper[4757]: E1216 12:48:18.952619 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.006613 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.006688 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.006701 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.006716 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.006728 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:19Z","lastTransitionTime":"2025-12-16T12:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.108632 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.109099 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.109114 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.109131 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.109143 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:19Z","lastTransitionTime":"2025-12-16T12:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.212019 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.212081 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.212090 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.212105 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:19 crc kubenswrapper[4757]: I1216 12:48:19.212123 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:19Z","lastTransitionTime":"2025-12-16T12:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.441432 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.441473 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.441481 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.441494 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.441504 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:20Z","lastTransitionTime":"2025-12-16T12:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.748665 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.748724 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.748733 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.748745 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.748756 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:20Z","lastTransitionTime":"2025-12-16T12:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.852260 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.852348 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.852361 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.852379 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.852409 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:20Z","lastTransitionTime":"2025-12-16T12:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.948913 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.948985 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.949001 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.949212 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:20 crc kubenswrapper[4757]: E1216 12:48:20.949205 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:20 crc kubenswrapper[4757]: E1216 12:48:20.949362 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:20 crc kubenswrapper[4757]: E1216 12:48:20.949575 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:20 crc kubenswrapper[4757]: E1216 12:48:20.949697 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.953993 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.954022 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.954034 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.954062 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:20 crc kubenswrapper[4757]: I1216 12:48:20.954076 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:20Z","lastTransitionTime":"2025-12-16T12:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.057119 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.057234 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.057257 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.057281 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.057298 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.366818 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.366868 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.366884 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.366909 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.366927 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.420836 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.422599 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:48:21 crc kubenswrapper[4757]: E1216 12:48:21.422872 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.469460 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.469517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.469532 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.469554 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.469570 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.502282 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podStartSLOduration=67.50226835 podStartE2EDuration="1m7.50226835s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.501925818 +0000 UTC m=+86.929669624" watchObservedRunningTime="2025-12-16 12:48:21.50226835 +0000 UTC m=+86.930012146" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.516523 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xhz4k" podStartSLOduration=67.516475089 podStartE2EDuration="1m7.516475089s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.515464363 +0000 UTC m=+86.943208159" watchObservedRunningTime="2025-12-16 12:48:21.516475089 +0000 UTC m=+86.944218885" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.547149 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6wv7w" podStartSLOduration=67.547130886 podStartE2EDuration="1m7.547130886s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.527565095 +0000 UTC m=+86.955308961" watchObservedRunningTime="2025-12-16 12:48:21.547130886 +0000 UTC m=+86.974874682" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.571540 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.571569 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.571578 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.571590 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.571599 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.572638 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8lq2b" podStartSLOduration=67.572627879 podStartE2EDuration="1m7.572627879s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.572426952 +0000 UTC m=+87.000170748" watchObservedRunningTime="2025-12-16 12:48:21.572627879 +0000 UTC m=+87.000371675" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.572837 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.572832056 podStartE2EDuration="1m8.572832056s" podCreationTimestamp="2025-12-16 12:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.548439652 +0000 UTC m=+86.976183448" watchObservedRunningTime="2025-12-16 12:48:21.572832056 +0000 UTC m=+87.000575852" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.597143 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cz9q7" podStartSLOduration=67.597121765 podStartE2EDuration="1m7.597121765s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.585941615 +0000 UTC m=+87.013685411" watchObservedRunningTime="2025-12-16 12:48:21.597121765 +0000 UTC m=+87.024865571" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.661851 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.661826761 podStartE2EDuration="1m6.661826761s" podCreationTimestamp="2025-12-16 12:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.661818411 +0000 UTC m=+87.089562217" watchObservedRunningTime="2025-12-16 12:48:21.661826761 +0000 UTC m=+87.089570557" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.673941 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.673976 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.673988 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.674008 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.674020 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.684548 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.684531314 podStartE2EDuration="1m8.684531314s" podCreationTimestamp="2025-12-16 12:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.6835771 +0000 UTC m=+87.111320916" watchObservedRunningTime="2025-12-16 12:48:21.684531314 +0000 UTC m=+87.112275110" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.711779 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.711758428 podStartE2EDuration="36.711758428s" podCreationTimestamp="2025-12-16 12:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.699106376 +0000 UTC m=+87.126850192" watchObservedRunningTime="2025-12-16 12:48:21.711758428 +0000 UTC m=+87.139502224" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.753927 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8t5c" podStartSLOduration=67.753906967 podStartE2EDuration="1m7.753906967s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.744454818 +0000 UTC m=+87.172198614" watchObservedRunningTime="2025-12-16 12:48:21.753906967 +0000 UTC m=+87.181650763" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.776180 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.776212 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.776220 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.776233 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.776243 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.879208 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.879277 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.879288 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.879306 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.879320 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.981935 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.981968 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.981978 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.981991 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:21 crc kubenswrapper[4757]: I1216 12:48:21.981999 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:21Z","lastTransitionTime":"2025-12-16T12:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.085165 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.085506 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.085754 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.086125 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.086379 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.189505 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.189806 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.189930 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.190128 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.190253 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.292458 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.292532 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.292544 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.292560 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.292572 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.394686 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.394753 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.394764 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.394801 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.394813 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.497637 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.497692 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.497702 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.497715 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.497725 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.599917 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.599951 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.599961 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.599975 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.599985 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.702992 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.703102 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.703120 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.703143 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.703161 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.806487 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.806525 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.806533 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.806548 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.806558 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.909458 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.909495 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.909505 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.909521 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.909532 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:22Z","lastTransitionTime":"2025-12-16T12:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.948389 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:22 crc kubenswrapper[4757]: E1216 12:48:22.948513 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.948619 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.948659 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:22 crc kubenswrapper[4757]: I1216 12:48:22.948730 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:22 crc kubenswrapper[4757]: E1216 12:48:22.948943 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:22 crc kubenswrapper[4757]: E1216 12:48:22.949127 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:22 crc kubenswrapper[4757]: E1216 12:48:22.949197 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.011649 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.011717 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.011732 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.011768 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.011780 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.114532 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.114813 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.114900 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.114978 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.115068 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.218366 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.218424 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.218436 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.218450 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.218461 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.321105 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.321153 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.321164 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.321184 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.321194 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.423399 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.423442 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.423453 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.423471 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.423482 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.526644 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.526689 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.526702 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.526717 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.526729 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.629527 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.629826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.629910 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.629988 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.630109 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.731995 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.732063 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.732077 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.732110 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.732122 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.834865 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.834916 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.834927 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.834942 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.834951 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.937729 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.937996 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.938093 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.938188 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:23 crc kubenswrapper[4757]: I1216 12:48:23.938263 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:23Z","lastTransitionTime":"2025-12-16T12:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.040805 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.040852 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.040863 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.040878 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.040889 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.143058 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.143317 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.143381 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.143443 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.143506 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.245330 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.245375 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.245386 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.245403 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.245414 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.348053 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.348322 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.348429 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.348538 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.348609 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.451589 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.451806 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.452044 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.452215 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.452384 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.556445 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.556771 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.556856 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.556945 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.557054 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.660274 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.660330 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.660341 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.660356 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.660365 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.763124 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.763181 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.763193 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.763213 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.763226 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.865843 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.865877 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.865888 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.865903 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.865915 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.948478 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.948658 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:24 crc kubenswrapper[4757]: E1216 12:48:24.949610 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.949693 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.949723 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:24 crc kubenswrapper[4757]: E1216 12:48:24.949859 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:24 crc kubenswrapper[4757]: E1216 12:48:24.950205 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:24 crc kubenswrapper[4757]: E1216 12:48:24.950328 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.969021 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.969060 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.969070 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.969083 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:24 crc kubenswrapper[4757]: I1216 12:48:24.969095 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:24Z","lastTransitionTime":"2025-12-16T12:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.071585 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.071621 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.071631 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.071644 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.071653 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.173481 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.173712 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.173824 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.173924 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.174035 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.276975 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.277035 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.277048 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.277064 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.277073 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.379790 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.380140 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.380375 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.380540 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.380773 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.484100 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.484135 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.484143 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.484157 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.484166 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.586501 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.586545 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.586558 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.586577 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.586594 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.689536 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.689815 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.689915 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.690046 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.690114 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.792032 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.792070 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.792080 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.792095 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.792109 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.894469 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.894517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.894532 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.894553 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.894564 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.996889 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.996948 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.996960 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.996977 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:25 crc kubenswrapper[4757]: I1216 12:48:25.996988 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:25Z","lastTransitionTime":"2025-12-16T12:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.098847 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.098894 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.098906 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.098922 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.098932 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.200650 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.200693 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.200704 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.200718 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.200733 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.303979 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.304039 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.304049 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.304062 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.304072 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.406785 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.406826 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.406837 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.406853 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.406865 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.509719 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.509752 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.509760 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.509775 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.509784 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.612508 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.612547 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.612561 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.612578 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.612591 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.715101 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.715151 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.715161 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.715177 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.715188 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.823077 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.823148 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.823165 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.823223 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.823280 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.925517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.925722 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.925806 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.925869 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.925942 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:26Z","lastTransitionTime":"2025-12-16T12:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.948685 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.948790 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.948727 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:26 crc kubenswrapper[4757]: E1216 12:48:26.948921 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:26 crc kubenswrapper[4757]: E1216 12:48:26.949082 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:26 crc kubenswrapper[4757]: E1216 12:48:26.949212 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:26 crc kubenswrapper[4757]: I1216 12:48:26.949325 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:26 crc kubenswrapper[4757]: E1216 12:48:26.949500 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.028443 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.028473 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.028482 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.028516 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.028526 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.130608 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.130885 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.130972 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.131175 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.131378 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.234224 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.234267 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.234281 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.234297 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.234309 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.336460 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.336523 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.336534 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.336551 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.336564 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.439474 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.439517 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.439534 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.439555 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.439572 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.541872 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.542259 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.542431 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.542575 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.542706 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.645281 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.645358 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.645382 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.645416 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.645438 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.748040 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.748069 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.748078 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.748091 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.748101 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.851175 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.852325 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.852361 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.852392 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.852413 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.913031 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.913069 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.913082 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.913097 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 12:48:27 crc kubenswrapper[4757]: I1216 12:48:27.913110 4757 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T12:48:27Z","lastTransitionTime":"2025-12-16T12:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.001219 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.001199998 podStartE2EDuration="22.001199998s" podCreationTimestamp="2025-12-16 12:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:21.755175102 +0000 UTC m=+87.182918898" watchObservedRunningTime="2025-12-16 12:48:28.001199998 +0000 UTC m=+93.428943794" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.001736 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98"] Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.002220 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.004541 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.004549 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.005104 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.005257 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.137322 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9279aa36-8ab8-439e-badd-d63e08b0c35f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.137387 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9279aa36-8ab8-439e-badd-d63e08b0c35f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.137422 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9279aa36-8ab8-439e-badd-d63e08b0c35f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.137443 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9279aa36-8ab8-439e-badd-d63e08b0c35f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.137463 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9279aa36-8ab8-439e-badd-d63e08b0c35f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.238675 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9279aa36-8ab8-439e-badd-d63e08b0c35f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc 
kubenswrapper[4757]: I1216 12:48:28.238718 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9279aa36-8ab8-439e-badd-d63e08b0c35f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.238742 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9279aa36-8ab8-439e-badd-d63e08b0c35f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.238748 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9279aa36-8ab8-439e-badd-d63e08b0c35f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.238955 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9279aa36-8ab8-439e-badd-d63e08b0c35f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.238999 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9279aa36-8ab8-439e-badd-d63e08b0c35f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.239184 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9279aa36-8ab8-439e-badd-d63e08b0c35f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.239669 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9279aa36-8ab8-439e-badd-d63e08b0c35f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.254514 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9279aa36-8ab8-439e-badd-d63e08b0c35f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.256894 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/9279aa36-8ab8-439e-badd-d63e08b0c35f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lst98\" (UID: \"9279aa36-8ab8-439e-badd-d63e08b0c35f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.314968 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.524864 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" event={"ID":"9279aa36-8ab8-439e-badd-d63e08b0c35f","Type":"ContainerStarted","Data":"e09fc90e43c75eec743e4d5cb7d9b65f89486db0f4065f6f6f87c5168840652f"} Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.524963 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" event={"ID":"9279aa36-8ab8-439e-badd-d63e08b0c35f","Type":"ContainerStarted","Data":"8c2abc1238fa9e1c8019df31589a865b615293dc6863301f307d9406a5edc117"} Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.949288 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.949378 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.949377 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:28 crc kubenswrapper[4757]: E1216 12:48:28.949444 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:28 crc kubenswrapper[4757]: I1216 12:48:28.949479 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:28 crc kubenswrapper[4757]: E1216 12:48:28.949580 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:28 crc kubenswrapper[4757]: E1216 12:48:28.949703 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:28 crc kubenswrapper[4757]: E1216 12:48:28.949753 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:30 crc kubenswrapper[4757]: I1216 12:48:30.948824 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:30 crc kubenswrapper[4757]: I1216 12:48:30.948894 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:30 crc kubenswrapper[4757]: I1216 12:48:30.948835 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:30 crc kubenswrapper[4757]: E1216 12:48:30.948944 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:30 crc kubenswrapper[4757]: E1216 12:48:30.949067 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:30 crc kubenswrapper[4757]: I1216 12:48:30.949116 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:30 crc kubenswrapper[4757]: E1216 12:48:30.949236 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:30 crc kubenswrapper[4757]: E1216 12:48:30.949296 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:32 crc kubenswrapper[4757]: I1216 12:48:32.793970 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.794177 4757 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.794272 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs podName:0c1b0cca-3853-4bcf-8389-2fa9c754b5e8 nodeName:}" failed. No retries permitted until 2025-12-16 12:49:36.794250397 +0000 UTC m=+162.221994273 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs") pod "network-metrics-daemon-k6rww" (UID: "0c1b0cca-3853-4bcf-8389-2fa9c754b5e8") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 12:48:32 crc kubenswrapper[4757]: I1216 12:48:32.948696 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:32 crc kubenswrapper[4757]: I1216 12:48:32.948742 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:32 crc kubenswrapper[4757]: I1216 12:48:32.948774 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:32 crc kubenswrapper[4757]: I1216 12:48:32.948837 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.949057 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.949142 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.949238 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.949653 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:32 crc kubenswrapper[4757]: I1216 12:48:32.949801 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:48:32 crc kubenswrapper[4757]: E1216 12:48:32.949949 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:48:34 crc kubenswrapper[4757]: I1216 12:48:34.948790 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:34 crc kubenswrapper[4757]: I1216 12:48:34.948793 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:34 crc kubenswrapper[4757]: I1216 12:48:34.948907 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:34 crc kubenswrapper[4757]: I1216 12:48:34.950321 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:34 crc kubenswrapper[4757]: E1216 12:48:34.950324 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:34 crc kubenswrapper[4757]: E1216 12:48:34.950529 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:34 crc kubenswrapper[4757]: E1216 12:48:34.950568 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:34 crc kubenswrapper[4757]: E1216 12:48:34.950648 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:36 crc kubenswrapper[4757]: I1216 12:48:36.948378 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:36 crc kubenswrapper[4757]: E1216 12:48:36.948518 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:36 crc kubenswrapper[4757]: I1216 12:48:36.948570 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:36 crc kubenswrapper[4757]: I1216 12:48:36.948652 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:36 crc kubenswrapper[4757]: E1216 12:48:36.948741 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:36 crc kubenswrapper[4757]: I1216 12:48:36.948823 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:36 crc kubenswrapper[4757]: E1216 12:48:36.948891 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:36 crc kubenswrapper[4757]: E1216 12:48:36.948983 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:38 crc kubenswrapper[4757]: I1216 12:48:38.948159 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:38 crc kubenswrapper[4757]: I1216 12:48:38.948210 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:38 crc kubenswrapper[4757]: I1216 12:48:38.948289 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:38 crc kubenswrapper[4757]: E1216 12:48:38.948968 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:38 crc kubenswrapper[4757]: E1216 12:48:38.948835 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:38 crc kubenswrapper[4757]: I1216 12:48:38.948349 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:38 crc kubenswrapper[4757]: E1216 12:48:38.949057 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:38 crc kubenswrapper[4757]: E1216 12:48:38.949252 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:40 crc kubenswrapper[4757]: I1216 12:48:40.948328 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:40 crc kubenswrapper[4757]: I1216 12:48:40.948399 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:40 crc kubenswrapper[4757]: I1216 12:48:40.948443 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:40 crc kubenswrapper[4757]: I1216 12:48:40.948343 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:40 crc kubenswrapper[4757]: E1216 12:48:40.948469 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:40 crc kubenswrapper[4757]: E1216 12:48:40.948546 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:40 crc kubenswrapper[4757]: E1216 12:48:40.948633 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:40 crc kubenswrapper[4757]: E1216 12:48:40.948715 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:42 crc kubenswrapper[4757]: I1216 12:48:42.948352 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:42 crc kubenswrapper[4757]: I1216 12:48:42.948352 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:42 crc kubenswrapper[4757]: E1216 12:48:42.949299 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:42 crc kubenswrapper[4757]: E1216 12:48:42.949333 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:42 crc kubenswrapper[4757]: I1216 12:48:42.948578 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:42 crc kubenswrapper[4757]: E1216 12:48:42.949427 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:42 crc kubenswrapper[4757]: I1216 12:48:42.948420 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:42 crc kubenswrapper[4757]: E1216 12:48:42.949518 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:44 crc kubenswrapper[4757]: I1216 12:48:44.948204 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:44 crc kubenswrapper[4757]: I1216 12:48:44.948250 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:44 crc kubenswrapper[4757]: I1216 12:48:44.948892 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:44 crc kubenswrapper[4757]: E1216 12:48:44.950886 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:44 crc kubenswrapper[4757]: I1216 12:48:44.950913 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:44 crc kubenswrapper[4757]: E1216 12:48:44.951049 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:44 crc kubenswrapper[4757]: E1216 12:48:44.951124 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:44 crc kubenswrapper[4757]: E1216 12:48:44.951224 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:46 crc kubenswrapper[4757]: I1216 12:48:46.948375 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:46 crc kubenswrapper[4757]: I1216 12:48:46.948455 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:46 crc kubenswrapper[4757]: I1216 12:48:46.948531 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:46 crc kubenswrapper[4757]: E1216 12:48:46.948681 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:46 crc kubenswrapper[4757]: E1216 12:48:46.948773 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:46 crc kubenswrapper[4757]: E1216 12:48:46.948886 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:46 crc kubenswrapper[4757]: I1216 12:48:46.949158 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:46 crc kubenswrapper[4757]: E1216 12:48:46.949257 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:47 crc kubenswrapper[4757]: I1216 12:48:47.949487 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:48:47 crc kubenswrapper[4757]: E1216 12:48:47.949993 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t465t_openshift-ovn-kubernetes(b876e35b-75f8-407e-bf25-f7b3c2024428)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.593517 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/1.log" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.594402 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/0.log" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.594540 4757 generic.go:334] "Generic (PLEG): container finished" podID="395610a4-58ca-497e-93a6-714bd6c111c1" containerID="e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3" exitCode=1 Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.594653 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerDied","Data":"e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3"} Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.594745 4757 scope.go:117] "RemoveContainer" containerID="6b65b8390392c1541b3d1effe5d68d4781ccc0f34d5c6f46518add42f6d13866" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.595228 4757 scope.go:117] "RemoveContainer" containerID="e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3" Dec 16 12:48:48 crc kubenswrapper[4757]: E1216 12:48:48.595422 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cz9q7_openshift-multus(395610a4-58ca-497e-93a6-714bd6c111c1)\"" pod="openshift-multus/multus-cz9q7" podUID="395610a4-58ca-497e-93a6-714bd6c111c1" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.615522 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lst98" podStartSLOduration=94.615436774 podStartE2EDuration="1m34.615436774s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:48:28.540500292 +0000 UTC m=+93.968244088" watchObservedRunningTime="2025-12-16 12:48:48.615436774 +0000 UTC m=+114.043180570" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.948339 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.948415 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.948338 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:48:48 crc kubenswrapper[4757]: E1216 12:48:48.948541 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:48:48 crc kubenswrapper[4757]: I1216 12:48:48.948699 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:48:48 crc kubenswrapper[4757]: E1216 12:48:48.948824 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:48:48 crc kubenswrapper[4757]: E1216 12:48:48.948875 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:48:48 crc kubenswrapper[4757]: E1216 12:48:48.950111 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8"
Dec 16 12:48:49 crc kubenswrapper[4757]: I1216 12:48:49.599397 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/1.log"
Dec 16 12:48:50 crc kubenswrapper[4757]: I1216 12:48:50.948180 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:48:50 crc kubenswrapper[4757]: E1216 12:48:50.948920 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:48:50 crc kubenswrapper[4757]: I1216 12:48:50.948510 4757 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:50 crc kubenswrapper[4757]: I1216 12:48:50.948582 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:50 crc kubenswrapper[4757]: E1216 12:48:50.949373 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:50 crc kubenswrapper[4757]: I1216 12:48:50.948470 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:50 crc kubenswrapper[4757]: E1216 12:48:50.949308 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:50 crc kubenswrapper[4757]: E1216 12:48:50.949569 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:52 crc kubenswrapper[4757]: I1216 12:48:52.948701 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:52 crc kubenswrapper[4757]: I1216 12:48:52.948776 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:52 crc kubenswrapper[4757]: I1216 12:48:52.948715 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:52 crc kubenswrapper[4757]: E1216 12:48:52.948825 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:52 crc kubenswrapper[4757]: E1216 12:48:52.948888 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:52 crc kubenswrapper[4757]: E1216 12:48:52.949099 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:52 crc kubenswrapper[4757]: I1216 12:48:52.949579 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:52 crc kubenswrapper[4757]: E1216 12:48:52.949770 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:54 crc kubenswrapper[4757]: I1216 12:48:54.948271 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:54 crc kubenswrapper[4757]: I1216 12:48:54.948271 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:54 crc kubenswrapper[4757]: I1216 12:48:54.948378 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:54 crc kubenswrapper[4757]: E1216 12:48:54.949765 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:54 crc kubenswrapper[4757]: I1216 12:48:54.949785 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:54 crc kubenswrapper[4757]: E1216 12:48:54.949969 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:54 crc kubenswrapper[4757]: E1216 12:48:54.950088 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:54 crc kubenswrapper[4757]: E1216 12:48:54.949846 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:54 crc kubenswrapper[4757]: E1216 12:48:54.985447 4757 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 16 12:48:55 crc kubenswrapper[4757]: E1216 12:48:55.038097 4757 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 12:48:56 crc kubenswrapper[4757]: I1216 12:48:56.948678 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:56 crc kubenswrapper[4757]: I1216 12:48:56.948702 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:56 crc kubenswrapper[4757]: I1216 12:48:56.948744 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:56 crc kubenswrapper[4757]: I1216 12:48:56.948673 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:56 crc kubenswrapper[4757]: E1216 12:48:56.948821 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:56 crc kubenswrapper[4757]: E1216 12:48:56.948960 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:56 crc kubenswrapper[4757]: E1216 12:48:56.949041 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:56 crc kubenswrapper[4757]: E1216 12:48:56.949073 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:48:58 crc kubenswrapper[4757]: I1216 12:48:58.948918 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:48:58 crc kubenswrapper[4757]: I1216 12:48:58.949038 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:48:58 crc kubenswrapper[4757]: E1216 12:48:58.949093 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:48:58 crc kubenswrapper[4757]: E1216 12:48:58.949159 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:48:58 crc kubenswrapper[4757]: I1216 12:48:58.948956 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:48:58 crc kubenswrapper[4757]: E1216 12:48:58.949246 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:48:58 crc kubenswrapper[4757]: I1216 12:48:58.949385 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:48:58 crc kubenswrapper[4757]: E1216 12:48:58.949459 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:49:00 crc kubenswrapper[4757]: E1216 12:49:00.039120 4757 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 12:49:00 crc kubenswrapper[4757]: I1216 12:49:00.948799 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:00 crc kubenswrapper[4757]: I1216 12:49:00.948857 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:00 crc kubenswrapper[4757]: I1216 12:49:00.948859 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:49:00 crc kubenswrapper[4757]: I1216 12:49:00.948827 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:49:00 crc kubenswrapper[4757]: E1216 12:49:00.948936 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 12:49:00 crc kubenswrapper[4757]: E1216 12:49:00.949070 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:49:00 crc kubenswrapper[4757]: E1216 12:49:00.949244 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:49:00 crc kubenswrapper[4757]: I1216 12:49:00.949321 4757 scope.go:117] "RemoveContainer" containerID="e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3" Dec 16 12:49:00 crc kubenswrapper[4757]: E1216 12:49:00.949346 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:49:01 crc kubenswrapper[4757]: I1216 12:49:01.642178 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/1.log" Dec 16 12:49:01 crc kubenswrapper[4757]: I1216 12:49:01.642522 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerStarted","Data":"04480320c664741eda6338e8db28abff320565456f1d04941da70ec65707aa77"} Dec 16 12:49:01 crc kubenswrapper[4757]: I1216 12:49:01.949117 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.647111 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/3.log" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.649166 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerStarted","Data":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"} Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.650137 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.677613 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podStartSLOduration=108.677596652 podStartE2EDuration="1m48.677596652s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:02.676618376 +0000 UTC m=+128.104362172" watchObservedRunningTime="2025-12-16 12:49:02.677596652 +0000 UTC m=+128.105340448" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.726640 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k6rww"] Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.726765 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:49:02 crc kubenswrapper[4757]: E1216 12:49:02.726859 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.948726 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.948796 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:02 crc kubenswrapper[4757]: I1216 12:49:02.948875 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 12:49:02 crc kubenswrapper[4757]: E1216 12:49:02.948866 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 12:49:02 crc kubenswrapper[4757]: E1216 12:49:02.948979 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 12:49:02 crc kubenswrapper[4757]: E1216 12:49:02.949086 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:49:04 crc kubenswrapper[4757]: I1216 12:49:04.948957 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 12:49:04 crc kubenswrapper[4757]: E1216 12:49:04.950074 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 12:49:04 crc kubenswrapper[4757]: I1216 12:49:04.950178 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 12:49:04 crc kubenswrapper[4757]: I1216 12:49:04.950217 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 12:49:04 crc kubenswrapper[4757]: I1216 12:49:04.950337 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww"
Dec 16 12:49:04 crc kubenswrapper[4757]: E1216 12:49:04.950439 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-k6rww" podUID="0c1b0cca-3853-4bcf-8389-2fa9c754b5e8" Dec 16 12:49:04 crc kubenswrapper[4757]: E1216 12:49:04.950331 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 12:49:04 crc kubenswrapper[4757]: E1216 12:49:04.950554 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.949230 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.949271 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.949360 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.949911 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.952097 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.952591 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.953332 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.954449 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.954755 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 12:49:06 crc kubenswrapper[4757]: I1216 12:49:06.955342 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.727640 4757 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.758263 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xwpj8"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.758695 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.761663 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hm8ks"]
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.762076 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.763583 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.763963 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.764057 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.764130 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.764176 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.763989 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.764324 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.765348 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.768578 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.769193 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.771199 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.772493 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v"]
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.772871 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v"
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.773285 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wg8vk"]
Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.773649 4757 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.774181 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.774290 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.778194 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.778695 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.779462 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.779667 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.779900 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.780209 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wzkwh"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.788698 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.789549 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.789775 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.789883 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.789949 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.790261 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.790505 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.790522 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.791904 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.792262 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.793234 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.793258 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.793814 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.794194 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.794454 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.794820 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.794872 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zlc9d"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.795295 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.795609 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.796101 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4hxkq"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.796438 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.798879 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wspqp"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.799543 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-45d7q"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.800126 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.800628 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.822454 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.835875 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.836020 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.836231 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.836256 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.836452 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.836693 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837251 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837434 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837581 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837708 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837800 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837884 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837972 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838082 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838086 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838246 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838327 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837587 4757 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.837797 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838535 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838546 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.838803 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.839057 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.839231 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.839474 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846117 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846204 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846271 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846323 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846206 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846704 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.846953 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.847204 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.851691 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.851861 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.852147 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.852515 4757 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.852793 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.853135 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.852155 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.853607 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.853851 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.854133 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.854252 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xwpj8"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.854293 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.854503 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.853686 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.854735 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.854919 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.856925 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.857092 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.857204 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.857430 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.857668 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.857935 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.858172 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.858287 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.858413 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.858451 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.858589 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.858731 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.859202 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.859264 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.859404 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.859550 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.859624 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.872295 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.880283 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.880667 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.885324 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.885917 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.888780 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.904018 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.904152 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.904286 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.904429 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.904485 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.904887 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.906193 4757 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.906680 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5pm5v"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.907291 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.906691 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.907822 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4j9nz"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.908180 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.909345 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910323 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-oauth-config\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910355 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-service-ca-bundle\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910374 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910391 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910407 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910423 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-config\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910439 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-dir\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910452 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-encryption-config\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910470 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910509 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-client-ca\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910523 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-policies\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910538 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqjtz\" (UniqueName: \"kubernetes.io/projected/c67f2cc7-204e-4c8f-9c93-b02372c5c296-kube-api-access-tqjtz\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910553 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd13c19-5caf-427a-a09a-1929b550dc04-trusted-ca\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910572 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910586 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9hm\" (UniqueName: \"kubernetes.io/projected/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-kube-api-access-8l9hm\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910603 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c0404f-467f-4b42-8cc6-ee0675ff8a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910618 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910634 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-serving-cert\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910648 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1227fe2-f17b-4427-a14d-b5f3be455bf7-auth-proxy-config\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910669 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910684 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lq82\" (UniqueName: \"kubernetes.io/projected/95287708-9f29-44fd-a6e8-e9ad0b1e934e-kube-api-access-9lq82\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910724 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-9q7mk\" (UniqueName: \"kubernetes.io/projected/f1227fe2-f17b-4427-a14d-b5f3be455bf7-kube-api-access-9q7mk\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910748 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5pv\" (UniqueName: \"kubernetes.io/projected/4daf4899-3f47-4776-b4ef-a54a340e95f5-kube-api-access-jk5pv\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910764 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910780 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-images\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910799 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910814 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27nrq\" (UniqueName: \"kubernetes.io/projected/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-kube-api-access-27nrq\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910830 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910847 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 
12:49:08.910863 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910878 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910893 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18370ed0-2552-4394-ab48-5e61b770ad66-serving-cert\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910908 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/78aba4b4-7b55-4500-ad35-0ffa726b6f4a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-45kh6\" (UID: \"78aba4b4-7b55-4500-ad35-0ffa726b6f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910927 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-config\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910941 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-config\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910958 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.910972 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-audit-dir\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 
12:49:08.910987 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl6lr\" (UniqueName: \"kubernetes.io/projected/66c0404f-467f-4b42-8cc6-ee0675ff8a55-kube-api-access-pl6lr\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911020 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-service-ca\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911035 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95287708-9f29-44fd-a6e8-e9ad0b1e934e-serving-cert\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911053 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkmms\" (UniqueName: \"kubernetes.io/projected/78aba4b4-7b55-4500-ad35-0ffa726b6f4a-kube-api-access-jkmms\") pod \"cluster-samples-operator-665b6dd947-45kh6\" (UID: \"78aba4b4-7b55-4500-ad35-0ffa726b6f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911078 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgkh\" (UniqueName: \"kubernetes.io/projected/39a3c195-3130-4fbb-903e-1ac8ab630ced-kube-api-access-jrgkh\") pod \"downloads-7954f5f757-4hxkq\" (UID: \"39a3c195-3130-4fbb-903e-1ac8ab630ced\") " pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911092 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911107 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df94dff2-af59-42da-be83-0eb6c9aba353-serving-cert\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911124 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-client-ca\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:08 crc 
kubenswrapper[4757]: I1216 12:49:08.911140 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911154 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwnb4\" (UniqueName: \"kubernetes.io/projected/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-kube-api-access-vwnb4\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911168 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/18370ed0-2552-4394-ab48-5e61b770ad66-available-featuregates\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911181 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c0404f-467f-4b42-8cc6-ee0675ff8a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911196 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-config\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911211 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrswd\" (UniqueName: \"kubernetes.io/projected/18370ed0-2552-4394-ab48-5e61b770ad66-kube-api-access-lrswd\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911227 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1227fe2-f17b-4427-a14d-b5f3be455bf7-machine-approver-tls\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911242 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9sm\" (UniqueName: \"kubernetes.io/projected/df94dff2-af59-42da-be83-0eb6c9aba353-kube-api-access-8c9sm\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911256 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-serving-cert\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911274 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd13c19-5caf-427a-a09a-1929b550dc04-config\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911287 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-672dh\" (UniqueName: \"kubernetes.io/projected/fbd13c19-5caf-427a-a09a-1929b550dc04-kube-api-access-672dh\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911306 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-trusted-ca-bundle\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911320 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-audit-policies\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911337 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-oauth-serving-cert\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911351 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daf4899-3f47-4776-b4ef-a54a340e95f5-serving-cert\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911367 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-etcd-client\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911381 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-config\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911395 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1227fe2-f17b-4427-a14d-b5f3be455bf7-config\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.911411 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd13c19-5caf-427a-a09a-1929b550dc04-serving-cert\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.914082 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.914823 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ggbw2"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.915084 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.915500 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.915853 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.916531 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.916946 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.918784 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ws9qr"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.919264 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.919427 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.921603 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.921871 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.922187 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.922535 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.925933 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jv58r"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.926907 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.927042 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.929987 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.930859 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.931764 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.932337 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.932324 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.932382 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.942577 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.943428 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.944682 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hm8ks"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.948077 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crbcx"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.948729 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.949165 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.949444 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.968859 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.979478 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.984637 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.998564 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.999466 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m"] Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.999620 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:08 crc kubenswrapper[4757]: I1216 12:49:08.999840 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.000872 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.034703 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.035326 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.036264 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4485h"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.036686 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.037795 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.043998 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044101 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-client-ca\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044123 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-policies\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044141 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqjtz\" (UniqueName: \"kubernetes.io/projected/c67f2cc7-204e-4c8f-9c93-b02372c5c296-kube-api-access-tqjtz\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044160 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd13c19-5caf-427a-a09a-1929b550dc04-trusted-ca\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044187 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044211 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9hm\" (UniqueName: \"kubernetes.io/projected/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-kube-api-access-8l9hm\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044233 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c0404f-467f-4b42-8cc6-ee0675ff8a55-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044254 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044280 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96934c7-7a51-4bc4-8c1b-959334813a98-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044302 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-serving-cert\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044325 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1227fe2-f17b-4427-a14d-b5f3be455bf7-auth-proxy-config\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044346 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2hb\" (UniqueName: \"kubernetes.io/projected/f6bb180e-b2b2-41c7-b220-71f363518413-kube-api-access-bm2hb\") pod \"dns-operator-744455d44c-5pm5v\" (UID: \"f6bb180e-b2b2-41c7-b220-71f363518413\") " pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044369 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044400 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044421 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lq82\" (UniqueName: \"kubernetes.io/projected/95287708-9f29-44fd-a6e8-e9ad0b1e934e-kube-api-access-9lq82\") pod 
\"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044443 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7mk\" (UniqueName: \"kubernetes.io/projected/f1227fe2-f17b-4427-a14d-b5f3be455bf7-kube-api-access-9q7mk\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044465 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-service-ca\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044490 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/822c6393-9fec-4b87-89bd-c67cb487567a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044502 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044516 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5pv\" (UniqueName: \"kubernetes.io/projected/4daf4899-3f47-4776-b4ef-a54a340e95f5-kube-api-access-jk5pv\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044890 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044929 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-images\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044964 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.044989 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-27nrq\" (UniqueName: \"kubernetes.io/projected/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-kube-api-access-27nrq\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045041 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045071 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045096 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045122 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045146 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045171 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0497d8a5-1e85-4989-8433-6b410d8f5427-service-ca-bundle\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045195 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18370ed0-2552-4394-ab48-5e61b770ad66-serving-cert\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045219 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/78aba4b4-7b55-4500-ad35-0ffa726b6f4a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-45kh6\" (UID: \"78aba4b4-7b55-4500-ad35-0ffa726b6f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045250 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/822c6393-9fec-4b87-89bd-c67cb487567a-proxy-tls\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045275 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-config\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045274 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045299 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-client\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045321 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-metrics-certs\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045347 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-config\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045368 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0096f9c3-dbd6-42e0-a8a3-00a816988de9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045391 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkmpg\" (UniqueName: \"kubernetes.io/projected/c96934c7-7a51-4bc4-8c1b-959334813a98-kube-api-access-zkmpg\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045412 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-default-certificate\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045438 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045461 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-audit-dir\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045486 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl6lr\" (UniqueName: \"kubernetes.io/projected/66c0404f-467f-4b42-8cc6-ee0675ff8a55-kube-api-access-pl6lr\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045510 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96934c7-7a51-4bc4-8c1b-959334813a98-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.045541 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-service-ca\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046644 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-client-ca\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046690 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxvhx\" (UniqueName: \"kubernetes.io/projected/ff15bc2c-c190-41e3-b2a9-58656f51045d-kube-api-access-pxvhx\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc 
kubenswrapper[4757]: I1216 12:49:09.046717 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95287708-9f29-44fd-a6e8-e9ad0b1e934e-serving-cert\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046741 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkmms\" (UniqueName: \"kubernetes.io/projected/78aba4b4-7b55-4500-ad35-0ffa726b6f4a-kube-api-access-jkmms\") pod \"cluster-samples-operator-665b6dd947-45kh6\" (UID: \"78aba4b4-7b55-4500-ad35-0ffa726b6f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046779 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgkh\" (UniqueName: \"kubernetes.io/projected/39a3c195-3130-4fbb-903e-1ac8ab630ced-kube-api-access-jrgkh\") pod \"downloads-7954f5f757-4hxkq\" (UID: \"39a3c195-3130-4fbb-903e-1ac8ab630ced\") " pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046811 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046836 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf6bf9c1-5d43-4ab3-a38f-d96308345ff4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr4dp\" (UID: \"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046860 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df94dff2-af59-42da-be83-0eb6c9aba353-serving-cert\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046883 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nrnq\" (UniqueName: \"kubernetes.io/projected/822c6393-9fec-4b87-89bd-c67cb487567a-kube-api-access-7nrnq\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046907 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-stats-auth\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc 
kubenswrapper[4757]: I1216 12:49:09.046934 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-client-ca\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046954 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z72dn\" (UniqueName: \"kubernetes.io/projected/0096f9c3-dbd6-42e0-a8a3-00a816988de9-kube-api-access-z72dn\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.046976 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-config\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047001 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047043 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff15bc2c-c190-41e3-b2a9-58656f51045d-serving-cert\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047067 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwnb4\" (UniqueName: \"kubernetes.io/projected/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-kube-api-access-vwnb4\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047086 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/18370ed0-2552-4394-ab48-5e61b770ad66-available-featuregates\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047107 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c0404f-467f-4b42-8cc6-ee0675ff8a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047314 4757 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m8kth"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047392 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-config\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047427 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrswd\" (UniqueName: \"kubernetes.io/projected/18370ed0-2552-4394-ab48-5e61b770ad66-kube-api-access-lrswd\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047451 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1227fe2-f17b-4427-a14d-b5f3be455bf7-machine-approver-tls\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047479 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9sm\" (UniqueName: \"kubernetes.io/projected/df94dff2-af59-42da-be83-0eb6c9aba353-kube-api-access-8c9sm\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047500 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-serving-cert\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047529 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd13c19-5caf-427a-a09a-1929b550dc04-config\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047553 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-672dh\" (UniqueName: \"kubernetes.io/projected/fbd13c19-5caf-427a-a09a-1929b550dc04-kube-api-access-672dh\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047578 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-trusted-ca-bundle\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047601 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-audit-policies\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047625 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-oauth-serving-cert\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047648 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daf4899-3f47-4776-b4ef-a54a340e95f5-serving-cert\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047673 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-etcd-client\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047699 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkg6b\" (UniqueName: \"kubernetes.io/projected/cf6bf9c1-5d43-4ab3-a38f-d96308345ff4-kube-api-access-pkg6b\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr4dp\" (UID: \"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047723 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-config\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047747 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1227fe2-f17b-4427-a14d-b5f3be455bf7-config\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047768 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6bb180e-b2b2-41c7-b220-71f363518413-metrics-tls\") pod \"dns-operator-744455d44c-5pm5v\" (UID: \"f6bb180e-b2b2-41c7-b220-71f363518413\") " pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047790 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd13c19-5caf-427a-a09a-1929b550dc04-serving-cert\") pod \"console-operator-58897d9998-wspqp\" (UID: 
\"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047814 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047839 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqh7\" (UniqueName: \"kubernetes.io/projected/0497d8a5-1e85-4989-8433-6b410d8f5427-kube-api-access-twqh7\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047865 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-oauth-config\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.047888 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-service-ca-bundle\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.048526 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.049039 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.049686 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.050029 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.050280 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.050810 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-57pwb"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.057687 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9g4nt"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.058639 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.052243 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c0404f-467f-4b42-8cc6-ee0675ff8a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.052648 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-policies\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.051323 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0096f9c3-dbd6-42e0-a8a3-00a816988de9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.058966 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059080 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059167 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059248 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-config\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059328 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-dir\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059404 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-encryption-config\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059478 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-ca\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.059683 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.051800 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-service-ca-bundle\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.060088 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.060628 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd13c19-5caf-427a-a09a-1929b550dc04-trusted-ca\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.062025 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-service-ca\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.063085 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.063130 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.063141 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-45d7q"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.064070 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-trusted-ca-bundle\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.064448 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.064970 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.065514 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-images\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.066365 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-config\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.066441 4757 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wg8vk"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.066474 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.066484 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wspqp"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.066752 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.067329 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-config\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.069728 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.069767 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.069778 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vdk49"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.070434 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.072160 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f1227fe2-f17b-4427-a14d-b5f3be455bf7-auth-proxy-config\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.073152 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-config\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.073876 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crbcx"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.073903 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4hxkq"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.073913 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.077948 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.078809 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.079383 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95287708-9f29-44fd-a6e8-e9ad0b1e934e-config\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.079500 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-dir\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.080419 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/18370ed0-2552-4394-ab48-5e61b770ad66-available-featuregates\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:09 crc 
kubenswrapper[4757]: I1216 12:49:09.080943 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-config\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.083363 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f1227fe2-f17b-4427-a14d-b5f3be455bf7-machine-approver-tls\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.083853 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daf4899-3f47-4776-b4ef-a54a340e95f5-serving-cert\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.084582 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.084978 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-encryption-config\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.086549 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-etcd-client\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.086786 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/78aba4b4-7b55-4500-ad35-0ffa726b6f4a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-45kh6\" (UID: \"78aba4b4-7b55-4500-ad35-0ffa726b6f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.087398 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.088562 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.088863 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-audit-dir\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.094473 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1227fe2-f17b-4427-a14d-b5f3be455bf7-config\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.094613 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.094842 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-oauth-config\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.094867 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.095132 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.095296 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.095555 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18370ed0-2552-4394-ab48-5e61b770ad66-serving-cert\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.095928 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:09 crc 
kubenswrapper[4757]: I1216 12:49:09.096200 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66c0404f-467f-4b42-8cc6-ee0675ff8a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.097331 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.097479 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.099937 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-audit-policies\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.099996 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zlc9d"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.100448 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.100526 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-client-ca\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.101377 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-oauth-serving-cert\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.101646 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.102208 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.103313 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jv58r"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.104108 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.106987 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.106538 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbd13c19-5caf-427a-a09a-1929b550dc04-config\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.106334 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.106429 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.107888 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95287708-9f29-44fd-a6e8-e9ad0b1e934e-serving-cert\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.107940 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5pm5v"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.114121 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4j9nz"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.114320 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.114405 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-57pwb"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.114731 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df94dff2-af59-42da-be83-0eb6c9aba353-serving-cert\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.115519 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-serving-cert\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.116268 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.116612 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-serving-cert\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.116666 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.119922 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbd13c19-5caf-427a-a09a-1929b550dc04-serving-cert\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.121397 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.123364 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.123946 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wzkwh"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.126673 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.130321 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4485h"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.130363 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.130374 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m8kth"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.139498 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-m8zkr"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.140311 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.142460 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.142605 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vdk49"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.142665 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.144260 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ws9qr"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.148868 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.161565 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96934c7-7a51-4bc4-8c1b-959334813a98-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.161901 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2hb\" (UniqueName: \"kubernetes.io/projected/f6bb180e-b2b2-41c7-b220-71f363518413-kube-api-access-bm2hb\") pod \"dns-operator-744455d44c-5pm5v\" (UID: \"f6bb180e-b2b2-41c7-b220-71f363518413\") " pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.162069 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.162477 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.162639 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-service-ca\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.162751 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/822c6393-9fec-4b87-89bd-c67cb487567a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.162861 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.163044 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0497d8a5-1e85-4989-8433-6b410d8f5427-service-ca-bundle\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.163230 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/822c6393-9fec-4b87-89bd-c67cb487567a-proxy-tls\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.163342 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-client\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.163444 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-metrics-certs\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.163690 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0096f9c3-dbd6-42e0-a8a3-00a816988de9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.163862 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkmpg\" (UniqueName: \"kubernetes.io/projected/c96934c7-7a51-4bc4-8c1b-959334813a98-kube-api-access-zkmpg\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164075 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-default-certificate\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: 
I1216 12:49:09.164256 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96934c7-7a51-4bc4-8c1b-959334813a98-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164522 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxvhx\" (UniqueName: \"kubernetes.io/projected/ff15bc2c-c190-41e3-b2a9-58656f51045d-kube-api-access-pxvhx\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164634 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf6bf9c1-5d43-4ab3-a38f-d96308345ff4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr4dp\" (UID: \"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164715 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nrnq\" (UniqueName: \"kubernetes.io/projected/822c6393-9fec-4b87-89bd-c67cb487567a-kube-api-access-7nrnq\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164809 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-stats-auth\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164925 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z72dn\" (UniqueName: \"kubernetes.io/projected/0096f9c3-dbd6-42e0-a8a3-00a816988de9-kube-api-access-z72dn\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.164285 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.165395 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-service-ca\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 
12:49:09.164662 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/822c6393-9fec-4b87-89bd-c67cb487567a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.166169 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0096f9c3-dbd6-42e0-a8a3-00a816988de9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.166605 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-config\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.166729 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff15bc2c-c190-41e3-b2a9-58656f51045d-serving-cert\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.166892 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96934c7-7a51-4bc4-8c1b-959334813a98-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167081 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167297 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-config\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167532 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkg6b\" (UniqueName: \"kubernetes.io/projected/cf6bf9c1-5d43-4ab3-a38f-d96308345ff4-kube-api-access-pkg6b\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr4dp\" (UID: \"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167581 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f6bb180e-b2b2-41c7-b220-71f363518413-metrics-tls\") pod \"dns-operator-744455d44c-5pm5v\" (UID: \"f6bb180e-b2b2-41c7-b220-71f363518413\") " pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167610 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167634 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqh7\" (UniqueName: \"kubernetes.io/projected/0497d8a5-1e85-4989-8433-6b410d8f5427-kube-api-access-twqh7\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167660 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0096f9c3-dbd6-42e0-a8a3-00a816988de9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.167686 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-ca\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.168317 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-ca\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.171188 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96934c7-7a51-4bc4-8c1b-959334813a98-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.172105 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.172794 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ff15bc2c-c190-41e3-b2a9-58656f51045d-etcd-client\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.175502 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0096f9c3-dbd6-42e0-a8a3-00a816988de9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.176084 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.177375 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf6bf9c1-5d43-4ab3-a38f-d96308345ff4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr4dp\" (UID: \"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.177462 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff15bc2c-c190-41e3-b2a9-58656f51045d-serving-cert\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.178133 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6bb180e-b2b2-41c7-b220-71f363518413-metrics-tls\") pod \"dns-operator-744455d44c-5pm5v\" (UID: \"f6bb180e-b2b2-41c7-b220-71f363518413\") " pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.178248 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9g4nt"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.182514 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.184399 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.185496 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d"] Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.201205 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.208591 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/822c6393-9fec-4b87-89bd-c67cb487567a-proxy-tls\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.220538 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.240599 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.249096 4757 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-stats-auth\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.260965 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.265142 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0497d8a5-1e85-4989-8433-6b410d8f5427-service-ca-bundle\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.280977 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.287093 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-metrics-certs\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.302083 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.319951 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.340051 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.349797 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0497d8a5-1e85-4989-8433-6b410d8f5427-default-certificate\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.365881 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.381100 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.401711 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.421243 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.481632 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.501116 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.521782 4757 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.541118 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.560425 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.586117 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.600736 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.620660 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.640841 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.660958 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.681292 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.701067 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.721714 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.741668 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.761295 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.781050 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.801123 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.821314 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.841675 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.860761 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.881220 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.901870 4757 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.920985 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.941289 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.959392 4757 request.go:700] Waited for 1.01550462s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/secrets?fieldSelector=metadata.name%3Dkube-apiserver-operator-serving-cert&limit=500&resourceVersion=0 Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.961155 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 12:49:09 crc kubenswrapper[4757]: I1216 12:49:09.981370 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.001457 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.021168 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.041983 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.062181 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.081400 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.101701 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.121409 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.140396 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.160405 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.191316 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.200820 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.221034 4757 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.240770 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.260524 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.280482 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.316417 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5pv\" (UniqueName: \"kubernetes.io/projected/4daf4899-3f47-4776-b4ef-a54a340e95f5-kube-api-access-jk5pv\") pod \"route-controller-manager-6576b87f9c-v9647\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.321556 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.333154 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.340587 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.360721 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.381888 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.401671 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.421257 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.441013 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.461694 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.481669 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.500398 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.506322 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647"] Dec 16 12:49:10 crc kubenswrapper[4757]: W1216 12:49:10.513420 4757 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4daf4899_3f47_4776_b4ef_a54a340e95f5.slice/crio-a3a484cc25575431e7f37134e059931086f2dca5c037615e17754e3cc1b1cea4 WatchSource:0}: Error finding container a3a484cc25575431e7f37134e059931086f2dca5c037615e17754e3cc1b1cea4: Status 404 returned error can't find the container with id a3a484cc25575431e7f37134e059931086f2dca5c037615e17754e3cc1b1cea4 Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.520409 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.541669 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.561309 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.596991 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqjtz\" (UniqueName: \"kubernetes.io/projected/c67f2cc7-204e-4c8f-9c93-b02372c5c296-kube-api-access-tqjtz\") pod \"oauth-openshift-558db77b4-wzkwh\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.601539 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.634795 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9hm\" (UniqueName: \"kubernetes.io/projected/ff965e39-8bf4-40d8-b7af-702f0c47bbb4-kube-api-access-8l9hm\") pod \"machine-api-operator-5694c8668f-wg8vk\" (UID: \"ff965e39-8bf4-40d8-b7af-702f0c47bbb4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.643154 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.651993 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.673937 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" event={"ID":"4daf4899-3f47-4776-b4ef-a54a340e95f5","Type":"ContainerStarted","Data":"a3a484cc25575431e7f37134e059931086f2dca5c037615e17754e3cc1b1cea4"} Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.674884 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27nrq\" (UniqueName: \"kubernetes.io/projected/3c2a0527-13e8-48c7-aa7d-992f5c9ca223-kube-api-access-27nrq\") pod \"apiserver-7bbb656c7d-wwkqf\" (UID: \"3c2a0527-13e8-48c7-aa7d-992f5c9ca223\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.681032 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.701171 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.722118 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.757139 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwnb4\" (UniqueName: \"kubernetes.io/projected/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-kube-api-access-vwnb4\") pod \"console-f9d7485db-zlc9d\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.777445 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrswd\" (UniqueName: \"kubernetes.io/projected/18370ed0-2552-4394-ab48-5e61b770ad66-kube-api-access-lrswd\") pod \"openshift-config-operator-7777fb866f-45d7q\" (UID: \"18370ed0-2552-4394-ab48-5e61b770ad66\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.799877 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wzkwh"] Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.802652 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 12:49:10 crc kubenswrapper[4757]: W1216 12:49:10.805901 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67f2cc7_204e_4c8f_9c93_b02372c5c296.slice/crio-44e202b6b4fbaaabd0df44ec8e012d255a59b2b72d7d2018e903c5edc6667afd WatchSource:0}: Error finding container 44e202b6b4fbaaabd0df44ec8e012d255a59b2b72d7d2018e903c5edc6667afd: Status 404 returned error can't find the container with id 44e202b6b4fbaaabd0df44ec8e012d255a59b2b72d7d2018e903c5edc6667afd Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.807229 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9sm\" (UniqueName: \"kubernetes.io/projected/df94dff2-af59-42da-be83-0eb6c9aba353-kube-api-access-8c9sm\") pod \"controller-manager-879f6c89f-xwpj8\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:10 crc kubenswrapper[4757]: 
I1216 12:49:10.821623 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.841649 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.860434 4757 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.880804 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.890295 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.923441 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.925540 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lq82\" (UniqueName: \"kubernetes.io/projected/95287708-9f29-44fd-a6e8-e9ad0b1e934e-kube-api-access-9lq82\") pod \"authentication-operator-69f744f599-hm8ks\" (UID: \"95287708-9f29-44fd-a6e8-e9ad0b1e934e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.957437 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7mk\" (UniqueName: \"kubernetes.io/projected/f1227fe2-f17b-4427-a14d-b5f3be455bf7-kube-api-access-9q7mk\") pod \"machine-approver-56656f9798-zxdsc\" (UID: \"f1227fe2-f17b-4427-a14d-b5f3be455bf7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.961192 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.973987 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkmms\" (UniqueName: \"kubernetes.io/projected/78aba4b4-7b55-4500-ad35-0ffa726b6f4a-kube-api-access-jkmms\") pod \"cluster-samples-operator-665b6dd947-45kh6\" (UID: \"78aba4b4-7b55-4500-ad35-0ffa726b6f4a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.979797 4757 request.go:700] Waited for 1.89019059s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.984148 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.993290 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl6lr\" (UniqueName: \"kubernetes.io/projected/66c0404f-467f-4b42-8cc6-ee0675ff8a55-kube-api-access-pl6lr\") pod \"openshift-apiserver-operator-796bbdcf4f-nz84v\" (UID: \"66c0404f-467f-4b42-8cc6-ee0675ff8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:10 crc kubenswrapper[4757]: I1216 12:49:10.995754 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-672dh\" (UniqueName: \"kubernetes.io/projected/fbd13c19-5caf-427a-a09a-1929b550dc04-kube-api-access-672dh\") pod \"console-operator-58897d9998-wspqp\" (UID: \"fbd13c19-5caf-427a-a09a-1929b550dc04\") " pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.016478 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgkh\" (UniqueName: \"kubernetes.io/projected/39a3c195-3130-4fbb-903e-1ac8ab630ced-kube-api-access-jrgkh\") pod \"downloads-7954f5f757-4hxkq\" (UID: \"39a3c195-3130-4fbb-903e-1ac8ab630ced\") " pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.021527 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.022683 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.041117 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.042851 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.060722 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.077550 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.092828 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.096883 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2hb\" (UniqueName: \"kubernetes.io/projected/f6bb180e-b2b2-41c7-b220-71f363518413-kube-api-access-bm2hb\") pod \"dns-operator-744455d44c-5pm5v\" (UID: \"f6bb180e-b2b2-41c7-b220-71f363518413\") " pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.116945 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z72dn\" (UniqueName: \"kubernetes.io/projected/0096f9c3-dbd6-42e0-a8a3-00a816988de9-kube-api-access-z72dn\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbjnc\" (UID: \"0096f9c3-dbd6-42e0-a8a3-00a816988de9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.137451 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxvhx\" (UniqueName: \"kubernetes.io/projected/ff15bc2c-c190-41e3-b2a9-58656f51045d-kube-api-access-pxvhx\") pod \"etcd-operator-b45778765-4j9nz\" (UID: \"ff15bc2c-c190-41e3-b2a9-58656f51045d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.157271 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkmpg\" (UniqueName: \"kubernetes.io/projected/c96934c7-7a51-4bc4-8c1b-959334813a98-kube-api-access-zkmpg\") pod \"kube-storage-version-migrator-operator-b67b599dd-96tn8\" (UID: \"c96934c7-7a51-4bc4-8c1b-959334813a98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.175401 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nrnq\" (UniqueName: \"kubernetes.io/projected/822c6393-9fec-4b87-89bd-c67cb487567a-kube-api-access-7nrnq\") pod \"machine-config-controller-84d6567774-2tgl4\" (UID: \"822c6393-9fec-4b87-89bd-c67cb487567a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.197935 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05c7923-2a21-4ed9-84e9-9298b1b0fed9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qpp47\" (UID: \"d05c7923-2a21-4ed9-84e9-9298b1b0fed9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.204057 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.213375 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.215666 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqh7\" (UniqueName: \"kubernetes.io/projected/0497d8a5-1e85-4989-8433-6b410d8f5427-kube-api-access-twqh7\") pod \"router-default-5444994796-ggbw2\" (UID: \"0497d8a5-1e85-4989-8433-6b410d8f5427\") " pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.316996 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkg6b\" (UniqueName: \"kubernetes.io/projected/cf6bf9c1-5d43-4ab3-a38f-d96308345ff4-kube-api-access-pkg6b\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr4dp\" (UID: \"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.668068 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.669838 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.670215 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.670523 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.670632 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.670749 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.669911 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.673714 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.676703 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.679575 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: E1216 12:49:11.680687 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.180662141 +0000 UTC m=+137.608405957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.680801 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-trusted-ca\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.680860 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-certificates\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.680956 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4bkw\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-kube-api-access-h4bkw\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.681029 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-tls\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.681088 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e7b566f-4c89-4834-ba16-f5e5286eda7e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" 
Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.681210 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-bound-sa-token\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.681254 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e7b566f-4c89-4834-ba16-f5e5286eda7e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.691132 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" event={"ID":"c67f2cc7-204e-4c8f-9c93-b02372c5c296","Type":"ContainerStarted","Data":"44e202b6b4fbaaabd0df44ec8e012d255a59b2b72d7d2018e903c5edc6667afd"} Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.783453 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:11 crc kubenswrapper[4757]: E1216 12:49:11.785505 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.285479077 +0000 UTC m=+137.713222873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786051 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-bound-sa-token\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786121 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e7b566f-4c89-4834-ba16-f5e5286eda7e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786221 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786249 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-trusted-ca\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786303 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba004648-a434-472a-9b98-1177f45eb479-trusted-ca\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786328 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-certificates\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786452 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba004648-a434-472a-9b98-1177f45eb479-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786488 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-h4bkw\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-kube-api-access-h4bkw\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786538 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba004648-a434-472a-9b98-1177f45eb479-metrics-tls\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786618 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-tls\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786644 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xqwz\" (UniqueName: \"kubernetes.io/projected/ba004648-a434-472a-9b98-1177f45eb479-kube-api-access-6xqwz\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.786706 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e7b566f-4c89-4834-ba16-f5e5286eda7e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.788923 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-certificates\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.790875 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-trusted-ca\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: E1216 12:49:11.791187 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.291173999 +0000 UTC m=+137.718917795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.795839 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-tls\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.796036 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e7b566f-4c89-4834-ba16-f5e5286eda7e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.815525 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-bound-sa-token\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.823351 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e7b566f-4c89-4834-ba16-f5e5286eda7e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.839105 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4bkw\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-kube-api-access-h4bkw\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.899849 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.899969 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.899993 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4231c2af-eaea-4187-ba50-370c2eb81315-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900024 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l85r\" (UniqueName: \"kubernetes.io/projected/597d97dd-5d3f-4409-9576-1dbdd245707d-kube-api-access-7l85r\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900060 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba004648-a434-472a-9b98-1177f45eb479-metrics-tls\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900090 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvzs4\" (UniqueName: \"kubernetes.io/projected/2528329d-5900-445c-8539-80caefbe1c15-kube-api-access-xvzs4\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900109 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900135 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d862c66e-b538-48e8-bbcb-a0cb2715a7de-secret-volume\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:11 crc kubenswrapper[4757]: E1216 12:49:11.900167 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.400134229 +0000 UTC m=+137.827878045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900224 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-serving-cert\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900267 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-tmpfs\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900306 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j9zd\" (UniqueName: \"kubernetes.io/projected/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-kube-api-access-7j9zd\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900342 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d862c66e-b538-48e8-bbcb-a0cb2715a7de-config-volume\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900392 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900439 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-csi-data-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900480 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj52f\" (UniqueName: \"kubernetes.io/projected/5706c05b-ab36-4ed2-ac86-06146a1bddda-kube-api-access-hj52f\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 
12:49:11.900497 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4231c2af-eaea-4187-ba50-370c2eb81315-images\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900532 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0febaaf-2028-4ff9-a751-0714a00a9412-config\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900596 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33620485-0085-4ad4-a908-addb88a5d7ec-config-volume\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900661 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/51608011-36e8-4875-b5c9-e9cfb96d1ef1-node-pullsecrets\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900684 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33620485-0085-4ad4-a908-addb88a5d7ec-metrics-tls\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.900705 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-srv-cert\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.902194 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-45d7q"] Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903183 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x22j6\" (UniqueName: \"kubernetes.io/projected/265120d9-fc19-4ca6-9ff4-3dbd22bac771-kube-api-access-x22j6\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903235 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903261 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-socket-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903286 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-image-import-ca\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903363 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-signing-cabundle\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903412 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba004648-a434-472a-9b98-1177f45eb479-trusted-ca\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903428 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-mountpoint-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903477 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-encryption-config\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903545 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xth\" (UniqueName: \"kubernetes.io/projected/9304ff8e-fd87-4758-b291-fb7b8a26c350-kube-api-access-s8xth\") pod \"package-server-manager-789f6589d5-lkb7m\" (UID: \"9304ff8e-fd87-4758-b291-fb7b8a26c350\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903572 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-plugins-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903600 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-config\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903615 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-webhook-cert\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903635 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba004648-a434-472a-9b98-1177f45eb479-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903649 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903678 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/597d97dd-5d3f-4409-9576-1dbdd245707d-node-bootstrap-token\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903693 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-audit\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903720 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/265120d9-fc19-4ca6-9ff4-3dbd22bac771-profile-collector-cert\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903750 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xqwz\" (UniqueName: \"kubernetes.io/projected/ba004648-a434-472a-9b98-1177f45eb479-kube-api-access-6xqwz\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903772 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/981c93fc-8f6d-4781-a261-8fa8308422cb-cert\") pod \"ingress-canary-57pwb\" (UID: \"981c93fc-8f6d-4781-a261-8fa8308422cb\") " pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903806 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8j4l\" (UniqueName: \"kubernetes.io/projected/981c93fc-8f6d-4781-a261-8fa8308422cb-kube-api-access-w8j4l\") pod \"ingress-canary-57pwb\" (UID: \"981c93fc-8f6d-4781-a261-8fa8308422cb\") " pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903822 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-registration-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903837 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/51608011-36e8-4875-b5c9-e9cfb96d1ef1-audit-dir\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903870 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/597d97dd-5d3f-4409-9576-1dbdd245707d-certs\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903904 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlht\" (UniqueName: \"kubernetes.io/projected/70232508-1ae7-4b91-bd26-c01e84786364-kube-api-access-swlht\") pod \"migrator-59844c95c7-vlrtg\" (UID: \"70232508-1ae7-4b91-bd26-c01e84786364\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903929 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j969r\" (UniqueName: \"kubernetes.io/projected/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-kube-api-access-j969r\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903949 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-apiservice-cert\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903971 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xhtg\" (UniqueName: \"kubernetes.io/projected/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-kube-api-access-8xhtg\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.903994 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cfmn\" (UniqueName: \"kubernetes.io/projected/17a05c95-20b3-4922-8bb5-b658f4115b5c-kube-api-access-2cfmn\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904045 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904072 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-etcd-serving-ca\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904108 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7613a1-c06e-4d91-90e1-309ef18204be-config\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904132 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-signing-key\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904193 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9304ff8e-fd87-4758-b291-fb7b8a26c350-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lkb7m\" (UID: \"9304ff8e-fd87-4758-b291-fb7b8a26c350\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904213 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c7613a1-c06e-4d91-90e1-309ef18204be-serving-cert\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904229 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-config\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904243 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/265120d9-fc19-4ca6-9ff4-3dbd22bac771-srv-cert\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904257 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6e99fe1-d31b-4a78-97fa-360102eab7f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4485h\" (UID: \"e6e99fe1-d31b-4a78-97fa-360102eab7f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904678 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmcjl\" (UniqueName: \"kubernetes.io/projected/4231c2af-eaea-4187-ba50-370c2eb81315-kube-api-access-fmcjl\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.904785 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba004648-a434-472a-9b98-1177f45eb479-trusted-ca\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.905532 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0febaaf-2028-4ff9-a751-0714a00a9412-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.905598 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2528329d-5900-445c-8539-80caefbe1c15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.905779 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0febaaf-2028-4ff9-a751-0714a00a9412-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.905813 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv4b9\" (UniqueName: \"kubernetes.io/projected/51608011-36e8-4875-b5c9-e9cfb96d1ef1-kube-api-access-vv4b9\") pod 
\"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907501 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2528329d-5900-445c-8539-80caefbe1c15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907572 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jfw6\" (UniqueName: \"kubernetes.io/projected/e6e99fe1-d31b-4a78-97fa-360102eab7f1-kube-api-access-8jfw6\") pod \"multus-admission-controller-857f4d67dd-4485h\" (UID: \"e6e99fe1-d31b-4a78-97fa-360102eab7f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907607 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxpsn\" (UniqueName: \"kubernetes.io/projected/33620485-0085-4ad4-a908-addb88a5d7ec-kube-api-access-qxpsn\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907658 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907676 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-etcd-client\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907829 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7g62\" (UniqueName: \"kubernetes.io/projected/8c7613a1-c06e-4d91-90e1-309ef18204be-kube-api-access-j7g62\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907872 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfvn\" (UniqueName: \"kubernetes.io/projected/d862c66e-b538-48e8-bbcb-a0cb2715a7de-kube-api-access-xjfvn\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907892 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4231c2af-eaea-4187-ba50-370c2eb81315-proxy-tls\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: 
\"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.907924 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2528329d-5900-445c-8539-80caefbe1c15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:11 crc kubenswrapper[4757]: E1216 12:49:11.909539 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.409516484 +0000 UTC m=+137.837260340 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.912581 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba004648-a434-472a-9b98-1177f45eb479-metrics-tls\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.937732 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xqwz\" (UniqueName: \"kubernetes.io/projected/ba004648-a434-472a-9b98-1177f45eb479-kube-api-access-6xqwz\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:11 crc kubenswrapper[4757]: W1216 12:49:11.959904 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18370ed0_2552_4394_ab48_5e61b770ad66.slice/crio-c2f26b0c740fcea06dd3825ad2af534d22458382672e5bd2e1ab3d445f17584c WatchSource:0}: Error finding container c2f26b0c740fcea06dd3825ad2af534d22458382672e5bd2e1ab3d445f17584c: Status 404 returned error can't find the container with id c2f26b0c740fcea06dd3825ad2af534d22458382672e5bd2e1ab3d445f17584c Dec 16 12:49:11 crc kubenswrapper[4757]: I1216 12:49:11.963335 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba004648-a434-472a-9b98-1177f45eb479-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6hp8j\" (UID: \"ba004648-a434-472a-9b98-1177f45eb479\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.009408 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.009668 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.509646892 +0000 UTC m=+137.937390688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.009900 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981c93fc-8f6d-4781-a261-8fa8308422cb-cert\") pod \"ingress-canary-57pwb\" (UID: \"981c93fc-8f6d-4781-a261-8fa8308422cb\") " pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.009927 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8j4l\" (UniqueName: \"kubernetes.io/projected/981c93fc-8f6d-4781-a261-8fa8308422cb-kube-api-access-w8j4l\") pod \"ingress-canary-57pwb\" (UID: \"981c93fc-8f6d-4781-a261-8fa8308422cb\") " pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.009951 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/597d97dd-5d3f-4409-9576-1dbdd245707d-certs\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.009973 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-registration-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.009995 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/51608011-36e8-4875-b5c9-e9cfb96d1ef1-audit-dir\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010112 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xhtg\" (UniqueName: \"kubernetes.io/projected/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-kube-api-access-8xhtg\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010135 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swlht\" (UniqueName: \"kubernetes.io/projected/70232508-1ae7-4b91-bd26-c01e84786364-kube-api-access-swlht\") pod 
\"migrator-59844c95c7-vlrtg\" (UID: \"70232508-1ae7-4b91-bd26-c01e84786364\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010157 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j969r\" (UniqueName: \"kubernetes.io/projected/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-kube-api-access-j969r\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010190 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-apiservice-cert\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010213 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cfmn\" (UniqueName: \"kubernetes.io/projected/17a05c95-20b3-4922-8bb5-b658f4115b5c-kube-api-access-2cfmn\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010235 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010267 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-etcd-serving-ca\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010294 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7613a1-c06e-4d91-90e1-309ef18204be-config\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010313 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-signing-key\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010343 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9304ff8e-fd87-4758-b291-fb7b8a26c350-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lkb7m\" (UID: \"9304ff8e-fd87-4758-b291-fb7b8a26c350\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010366 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c7613a1-c06e-4d91-90e1-309ef18204be-serving-cert\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010389 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-config\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010410 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/265120d9-fc19-4ca6-9ff4-3dbd22bac771-srv-cert\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010421 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-registration-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010430 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6e99fe1-d31b-4a78-97fa-360102eab7f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4485h\" (UID: \"e6e99fe1-d31b-4a78-97fa-360102eab7f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010495 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmcjl\" (UniqueName: \"kubernetes.io/projected/4231c2af-eaea-4187-ba50-370c2eb81315-kube-api-access-fmcjl\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010517 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0febaaf-2028-4ff9-a751-0714a00a9412-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010538 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2528329d-5900-445c-8539-80caefbe1c15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010555 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0febaaf-2028-4ff9-a751-0714a00a9412-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010570 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv4b9\" (UniqueName: \"kubernetes.io/projected/51608011-36e8-4875-b5c9-e9cfb96d1ef1-kube-api-access-vv4b9\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010590 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2528329d-5900-445c-8539-80caefbe1c15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010613 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jfw6\" (UniqueName: \"kubernetes.io/projected/e6e99fe1-d31b-4a78-97fa-360102eab7f1-kube-api-access-8jfw6\") pod \"multus-admission-controller-857f4d67dd-4485h\" (UID: \"e6e99fe1-d31b-4a78-97fa-360102eab7f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010632 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxpsn\" (UniqueName: \"kubernetes.io/projected/33620485-0085-4ad4-a908-addb88a5d7ec-kube-api-access-qxpsn\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010656 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010671 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-etcd-client\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010689 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7g62\" (UniqueName: \"kubernetes.io/projected/8c7613a1-c06e-4d91-90e1-309ef18204be-kube-api-access-j7g62\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010704 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfvn\" (UniqueName: \"kubernetes.io/projected/d862c66e-b538-48e8-bbcb-a0cb2715a7de-kube-api-access-xjfvn\") pod \"collect-profiles-29431485-p5xjr\" (UID: 
\"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010721 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4231c2af-eaea-4187-ba50-370c2eb81315-proxy-tls\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010741 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2528329d-5900-445c-8539-80caefbe1c15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010758 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010773 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4231c2af-eaea-4187-ba50-370c2eb81315-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010791 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l85r\" (UniqueName: \"kubernetes.io/projected/597d97dd-5d3f-4409-9576-1dbdd245707d-kube-api-access-7l85r\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010813 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvzs4\" (UniqueName: \"kubernetes.io/projected/2528329d-5900-445c-8539-80caefbe1c15-kube-api-access-xvzs4\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010831 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010852 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d862c66e-b538-48e8-bbcb-a0cb2715a7de-secret-volume\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010872 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-serving-cert\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010893 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-tmpfs\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010912 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d862c66e-b538-48e8-bbcb-a0cb2715a7de-config-volume\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010932 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j9zd\" (UniqueName: \"kubernetes.io/projected/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-kube-api-access-7j9zd\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010949 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010964 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-csi-data-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010985 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj52f\" (UniqueName: \"kubernetes.io/projected/5706c05b-ab36-4ed2-ac86-06146a1bddda-kube-api-access-hj52f\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.010999 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4231c2af-eaea-4187-ba50-370c2eb81315-images\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011041 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f0febaaf-2028-4ff9-a751-0714a00a9412-config\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011060 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33620485-0085-4ad4-a908-addb88a5d7ec-config-volume\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011080 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/51608011-36e8-4875-b5c9-e9cfb96d1ef1-node-pullsecrets\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011097 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33620485-0085-4ad4-a908-addb88a5d7ec-metrics-tls\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011112 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-srv-cert\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011129 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x22j6\" (UniqueName: \"kubernetes.io/projected/265120d9-fc19-4ca6-9ff4-3dbd22bac771-kube-api-access-x22j6\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011146 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011160 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-socket-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011177 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-image-import-ca\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011203 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-signing-cabundle\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011223 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-mountpoint-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011247 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-encryption-config\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011268 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xth\" (UniqueName: \"kubernetes.io/projected/9304ff8e-fd87-4758-b291-fb7b8a26c350-kube-api-access-s8xth\") pod \"package-server-manager-789f6589d5-lkb7m\" (UID: \"9304ff8e-fd87-4758-b291-fb7b8a26c350\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011282 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-plugins-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011298 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-config\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011312 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-webhook-cert\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011328 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011346 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/597d97dd-5d3f-4409-9576-1dbdd245707d-node-bootstrap-token\") pod \"machine-config-server-m8zkr\" (UID: 
\"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011359 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-audit\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.011377 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/265120d9-fc19-4ca6-9ff4-3dbd22bac771-profile-collector-cert\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.013427 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/51608011-36e8-4875-b5c9-e9cfb96d1ef1-audit-dir\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: W1216 12:49:12.013515 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1227fe2_f17b_4427_a14d_b5f3be455bf7.slice/crio-2e978ce473c69380cddb3cf151632655e9f608e70bb3f9db3a70346dc391b639 WatchSource:0}: Error finding container 2e978ce473c69380cddb3cf151632655e9f608e70bb3f9db3a70346dc391b639: Status 404 returned error can't find the container with id 2e978ce473c69380cddb3cf151632655e9f608e70bb3f9db3a70346dc391b639 Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.016861 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-etcd-serving-ca\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.017353 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.017384 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7613a1-c06e-4d91-90e1-309ef18204be-config\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.017884 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-apiservice-cert\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.017945 4757 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.51792883 +0000 UTC m=+137.945672626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.018098 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-config\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.018382 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.019161 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/265120d9-fc19-4ca6-9ff4-3dbd22bac771-profile-collector-cert\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.022987 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/597d97dd-5d3f-4409-9576-1dbdd245707d-certs\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.023905 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981c93fc-8f6d-4781-a261-8fa8308422cb-cert\") pod \"ingress-canary-57pwb\" (UID: \"981c93fc-8f6d-4781-a261-8fa8308422cb\") " pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.024532 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-signing-key\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.025598 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2528329d-5900-445c-8539-80caefbe1c15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 
crc kubenswrapper[4757]: I1216 12:49:12.026123 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-tmpfs\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.026562 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.026683 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d862c66e-b538-48e8-bbcb-a0cb2715a7de-config-volume\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.027234 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4231c2af-eaea-4187-ba50-370c2eb81315-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.027868 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e6e99fe1-d31b-4a78-97fa-360102eab7f1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4485h\" (UID: \"e6e99fe1-d31b-4a78-97fa-360102eab7f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.030343 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-socket-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.030783 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0febaaf-2028-4ff9-a751-0714a00a9412-config\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.030857 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-csi-data-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.031326 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4231c2af-eaea-4187-ba50-370c2eb81315-images\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: 
\"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.031368 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/51608011-36e8-4875-b5c9-e9cfb96d1ef1-node-pullsecrets\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.032035 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33620485-0085-4ad4-a908-addb88a5d7ec-config-volume\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.032162 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-mountpoint-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.032791 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-signing-cabundle\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.033257 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17a05c95-20b3-4922-8bb5-b658f4115b5c-plugins-dir\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.034029 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-config\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.035074 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/265120d9-fc19-4ca6-9ff4-3dbd22bac771-srv-cert\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.035407 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-etcd-client\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.035792 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-image-import-ca\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.037820 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/51608011-36e8-4875-b5c9-e9cfb96d1ef1-audit\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.037908 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4231c2af-eaea-4187-ba50-370c2eb81315-proxy-tls\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.044369 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9304ff8e-fd87-4758-b291-fb7b8a26c350-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lkb7m\" (UID: \"9304ff8e-fd87-4758-b291-fb7b8a26c350\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.044846 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2528329d-5900-445c-8539-80caefbe1c15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.045312 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d862c66e-b538-48e8-bbcb-a0cb2715a7de-secret-volume\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.046574 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c7613a1-c06e-4d91-90e1-309ef18204be-serving-cert\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.049975 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0febaaf-2028-4ff9-a751-0714a00a9412-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.050508 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-serving-cert\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.050824 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.050943 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-webhook-cert\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.051639 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-srv-cert\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.054636 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33620485-0085-4ad4-a908-addb88a5d7ec-metrics-tls\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.054884 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/597d97dd-5d3f-4409-9576-1dbdd245707d-node-bootstrap-token\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.056239 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/51608011-36e8-4875-b5c9-e9cfb96d1ef1-encryption-config\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.056578 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.058692 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xhtg\" (UniqueName: \"kubernetes.io/projected/78d0c760-7b97-4a51-9d17-1409ac8ab2d5-kube-api-access-8xhtg\") pod \"olm-operator-6b444d44fb-v479d\" (UID: \"78d0c760-7b97-4a51-9d17-1409ac8ab2d5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.078949 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlht\" (UniqueName: \"kubernetes.io/projected/70232508-1ae7-4b91-bd26-c01e84786364-kube-api-access-swlht\") pod \"migrator-59844c95c7-vlrtg\" (UID: \"70232508-1ae7-4b91-bd26-c01e84786364\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" Dec 16 12:49:12 crc kubenswrapper[4757]: 
I1216 12:49:12.114521 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.115152 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.615132044 +0000 UTC m=+138.042875840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.116842 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.118769 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j969r\" (UniqueName: \"kubernetes.io/projected/766ff2cb-7de6-4600-a7d1-ed44aa3aed43-kube-api-access-j969r\") pod \"service-ca-9c57cc56f-m8kth\" (UID: \"766ff2cb-7de6-4600-a7d1-ed44aa3aed43\") " pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.125546 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cfmn\" (UniqueName: \"kubernetes.io/projected/17a05c95-20b3-4922-8bb5-b658f4115b5c-kube-api-access-2cfmn\") pod \"csi-hostpathplugin-9g4nt\" (UID: \"17a05c95-20b3-4922-8bb5-b658f4115b5c\") " pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.142417 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxpsn\" (UniqueName: \"kubernetes.io/projected/33620485-0085-4ad4-a908-addb88a5d7ec-kube-api-access-qxpsn\") pod \"dns-default-vdk49\" (UID: \"33620485-0085-4ad4-a908-addb88a5d7ec\") " pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.165303 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.179280 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmcjl\" (UniqueName: \"kubernetes.io/projected/4231c2af-eaea-4187-ba50-370c2eb81315-kube-api-access-fmcjl\") pod \"machine-config-operator-74547568cd-9nvbm\" (UID: \"4231c2af-eaea-4187-ba50-370c2eb81315\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.183505 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8j4l\" (UniqueName: \"kubernetes.io/projected/981c93fc-8f6d-4781-a261-8fa8308422cb-kube-api-access-w8j4l\") pod \"ingress-canary-57pwb\" (UID: \"981c93fc-8f6d-4781-a261-8fa8308422cb\") " pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.216379 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.216915 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.716875213 +0000 UTC m=+138.144619009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.226691 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0febaaf-2028-4ff9-a751-0714a00a9412-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m679c\" (UID: \"f0febaaf-2028-4ff9-a751-0714a00a9412\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.230611 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7g62\" (UniqueName: \"kubernetes.io/projected/8c7613a1-c06e-4d91-90e1-309ef18204be-kube-api-access-j7g62\") pod \"service-ca-operator-777779d784-qnb8m\" (UID: \"8c7613a1-c06e-4d91-90e1-309ef18204be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.259407 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.264324 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv4b9\" (UniqueName: \"kubernetes.io/projected/51608011-36e8-4875-b5c9-e9cfb96d1ef1-kube-api-access-vv4b9\") pod \"apiserver-76f77b778f-jv58r\" (UID: \"51608011-36e8-4875-b5c9-e9cfb96d1ef1\") " pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.265891 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.269294 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfvn\" (UniqueName: \"kubernetes.io/projected/d862c66e-b538-48e8-bbcb-a0cb2715a7de-kube-api-access-xjfvn\") pod \"collect-profiles-29431485-p5xjr\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.292938 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.293034 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jfw6\" (UniqueName: \"kubernetes.io/projected/e6e99fe1-d31b-4a78-97fa-360102eab7f1-kube-api-access-8jfw6\") pod \"multus-admission-controller-857f4d67dd-4485h\" (UID: \"e6e99fe1-d31b-4a78-97fa-360102eab7f1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.313372 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.317321 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.317652 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.817637797 +0000 UTC m=+138.245381593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.317700 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l85r\" (UniqueName: \"kubernetes.io/projected/597d97dd-5d3f-4409-9576-1dbdd245707d-kube-api-access-7l85r\") pod \"machine-config-server-m8zkr\" (UID: \"597d97dd-5d3f-4409-9576-1dbdd245707d\") " pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.317736 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.327071 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.343276 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-57pwb" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.360928 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.364111 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvzs4\" (UniqueName: \"kubernetes.io/projected/2528329d-5900-445c-8539-80caefbe1c15-kube-api-access-xvzs4\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.370431 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2528329d-5900-445c-8539-80caefbe1c15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k5bsp\" (UID: \"2528329d-5900-445c-8539-80caefbe1c15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.371100 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x22j6\" (UniqueName: \"kubernetes.io/projected/265120d9-fc19-4ca6-9ff4-3dbd22bac771-kube-api-access-x22j6\") pod \"catalog-operator-68c6474976-lfff4\" (UID: \"265120d9-fc19-4ca6-9ff4-3dbd22bac771\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.373617 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.379778 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j9zd\" (UniqueName: \"kubernetes.io/projected/6a9f576c-c2af-47cb-8a4d-f7d8784aad87-kube-api-access-7j9zd\") pod \"packageserver-d55dfcdfc-5l22d\" (UID: \"6a9f576c-c2af-47cb-8a4d-f7d8784aad87\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.379963 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-m8zkr" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.403776 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xth\" (UniqueName: \"kubernetes.io/projected/9304ff8e-fd87-4758-b291-fb7b8a26c350-kube-api-access-s8xth\") pod \"package-server-manager-789f6589d5-lkb7m\" (UID: \"9304ff8e-fd87-4758-b291-fb7b8a26c350\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.417616 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj52f\" (UniqueName: \"kubernetes.io/projected/5706c05b-ab36-4ed2-ac86-06146a1bddda-kube-api-access-hj52f\") pod \"marketplace-operator-79b997595-crbcx\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.419308 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.419887 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:12.919871048 +0000 UTC m=+138.347614844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.431893 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.439508 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3e91d20-09bb-4846-bcf5-d90a938ce3c7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xnmf5\" (UID: \"c3e91d20-09bb-4846-bcf5-d90a938ce3c7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.523785 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.523912 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.023892754 +0000 UTC m=+138.451636550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.524708 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.525138 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.025122605 +0000 UTC m=+138.452866401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.528806 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.535957 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.543529 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.554099 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.574200 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.581310 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47"] Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.581509 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.601170 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.629765 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.630259 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.130239418 +0000 UTC m=+138.557983214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: W1216 12:49:12.692066 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05c7923_2a21_4ed9_84e9_9298b1b0fed9.slice/crio-40da9f79d419b65778e11689c891c16f831c760c41e5ea8589cbb65ab127485e WatchSource:0}: Error finding container 40da9f79d419b65778e11689c891c16f831c760c41e5ea8589cbb65ab127485e: Status 404 returned error can't find the container with id 40da9f79d419b65778e11689c891c16f831c760c41e5ea8589cbb65ab127485e Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.731599 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.731861 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.231849794 +0000 UTC m=+138.659593590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.736483 4757 generic.go:334] "Generic (PLEG): container finished" podID="18370ed0-2552-4394-ab48-5e61b770ad66" containerID="6322c09ee9a566ef3c3e5f4bed20cdf4781aaf78df2da69c359391f2a40b8d97" exitCode=0 Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.736600 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" event={"ID":"18370ed0-2552-4394-ab48-5e61b770ad66","Type":"ContainerDied","Data":"6322c09ee9a566ef3c3e5f4bed20cdf4781aaf78df2da69c359391f2a40b8d97"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.736630 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" event={"ID":"18370ed0-2552-4394-ab48-5e61b770ad66","Type":"ContainerStarted","Data":"c2f26b0c740fcea06dd3825ad2af534d22458382672e5bd2e1ab3d445f17584c"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.749160 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" event={"ID":"4daf4899-3f47-4776-b4ef-a54a340e95f5","Type":"ContainerStarted","Data":"785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.750608 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.753324 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-m8zkr" event={"ID":"597d97dd-5d3f-4409-9576-1dbdd245707d","Type":"ContainerStarted","Data":"9ff2986c02e066f09c0922d7e70a347c44c417ec05f5b90eccc7ddadeca7dd78"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.774721 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wspqp"] Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.777517 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4hxkq"] Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.784753 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" event={"ID":"f1227fe2-f17b-4427-a14d-b5f3be455bf7","Type":"ContainerStarted","Data":"2e978ce473c69380cddb3cf151632655e9f608e70bb3f9db3a70346dc391b639"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.788876 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ggbw2" event={"ID":"0497d8a5-1e85-4989-8433-6b410d8f5427","Type":"ContainerStarted","Data":"d6df6561d329e8c24f6bbe7e13baa83f6a9bc714b833e80a0fe6d3b9aafaaeac"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.788940 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ggbw2" 
event={"ID":"0497d8a5-1e85-4989-8433-6b410d8f5427","Type":"ContainerStarted","Data":"58f7705188943b538ccb4b4f3803f941ebf3a65e6e4e68d163ed0664df4a2894"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.803255 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" event={"ID":"d05c7923-2a21-4ed9-84e9-9298b1b0fed9","Type":"ContainerStarted","Data":"40da9f79d419b65778e11689c891c16f831c760c41e5ea8589cbb65ab127485e"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.806811 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" event={"ID":"c67f2cc7-204e-4c8f-9c93-b02372c5c296","Type":"ContainerStarted","Data":"ff0ea011e2afd0fa1092c8bafd867e2d454b7236061787f7b15f8e42f3081df5"} Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.807385 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.833044 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.834481 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.334464584 +0000 UTC m=+138.762208380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: I1216 12:49:12.934924 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:12 crc kubenswrapper[4757]: E1216 12:49:12.935305 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.43528594 +0000 UTC m=+138.863029796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:12 crc kubenswrapper[4757]: W1216 12:49:12.943685 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39a3c195_3130_4fbb_903e_1ac8ab630ced.slice/crio-1276dedb8650bb85d3dd185772d51a55f1649f1661bfb8a50be0ca7b8729d2d8 WatchSource:0}: Error finding container 1276dedb8650bb85d3dd185772d51a55f1649f1661bfb8a50be0ca7b8729d2d8: Status 404 returned error can't find the container with id 1276dedb8650bb85d3dd185772d51a55f1649f1661bfb8a50be0ca7b8729d2d8 Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.038570 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.038946 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.538921136 +0000 UTC m=+138.966664942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.067153 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.075842 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5pm5v"] Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.144133 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.144453 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.64444153 +0000 UTC m=+139.072185326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.155381 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wg8vk"] Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.248164 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.248309 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.748287771 +0000 UTC m=+139.176031567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.264528 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.265076 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.765061081 +0000 UTC m=+139.192804877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.366552 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.366653 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.866635955 +0000 UTC m=+139.294379751 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.366879 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.367186 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.86717783 +0000 UTC m=+139.294921626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.400541 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ggbw2" podStartSLOduration=119.400524364 podStartE2EDuration="1m59.400524364s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.398901034 +0000 UTC m=+138.826644830" watchObservedRunningTime="2025-12-16 12:49:13.400524364 +0000 UTC m=+138.828268160" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.439728 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" podStartSLOduration=119.439708606 podStartE2EDuration="1m59.439708606s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.438336471 +0000 UTC m=+138.866080267" watchObservedRunningTime="2025-12-16 12:49:13.439708606 +0000 UTC m=+138.867452402" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.468710 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.469489 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:13.969473592 +0000 UTC m=+139.397217388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.515951 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" podStartSLOduration=119.515930536 podStartE2EDuration="1m59.515930536s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.490352865 +0000 UTC m=+138.918096661" watchObservedRunningTime="2025-12-16 12:49:13.515930536 +0000 UTC m=+138.943674332" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.571074 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.571408 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.071395995 +0000 UTC m=+139.499139791 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.583998 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.671766 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.672210 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.17219037 +0000 UTC m=+139.599934166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.672384 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.682527 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:13 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:13 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:13 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.682572 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.775408 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.775788 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.275772984 +0000 UTC m=+139.703516780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.817721 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf"] Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.840644 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" event={"ID":"18370ed0-2552-4394-ab48-5e61b770ad66","Type":"ContainerStarted","Data":"f7840bbf02ed3c55ccb4e4dbf9c85c9e3e7d2da0997c2383cc0b417896eb6d8c"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.840993 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.862240 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" event={"ID":"f6bb180e-b2b2-41c7-b220-71f363518413","Type":"ContainerStarted","Data":"f4e778d5834fa18bae85dcfaff4a30b34fd6b8e932bed46809fb4aa711d378f4"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.872824 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4hxkq" event={"ID":"39a3c195-3130-4fbb-903e-1ac8ab630ced","Type":"ContainerStarted","Data":"3a342758cfcc5dd48a4e29a2416efe4ffaa20c8f8c09cc3af8a0f9e6b87f84bc"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.872875 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4hxkq" event={"ID":"39a3c195-3130-4fbb-903e-1ac8ab630ced","Type":"ContainerStarted","Data":"1276dedb8650bb85d3dd185772d51a55f1649f1661bfb8a50be0ca7b8729d2d8"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.874077 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.875278 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" podStartSLOduration=119.875256867 podStartE2EDuration="1m59.875256867s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.873722148 +0000 UTC m=+139.301465964" watchObservedRunningTime="2025-12-16 12:49:13.875256867 +0000 UTC m=+139.303000663" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.875975 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.876431 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-m8zkr" 
event={"ID":"597d97dd-5d3f-4409-9576-1dbdd245707d","Type":"ContainerStarted","Data":"3f0526c1fbd5dd0e46d93cc14f7c4dc2fe9532ed086edd8a9eeadb6aa3d3cc1c"} Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.877050 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.377031882 +0000 UTC m=+139.804775678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.878111 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.878146 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.888972 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" event={"ID":"ff965e39-8bf4-40d8-b7af-702f0c47bbb4","Type":"ContainerStarted","Data":"7bd6e302be9f54fda07a51964fed05734344451c2c52c30b5be794851a249557"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.889028 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" event={"ID":"ff965e39-8bf4-40d8-b7af-702f0c47bbb4","Type":"ContainerStarted","Data":"6409067a02018b6ce6e7d00f6005f8013826dd23f531a1910c92489951cb1cf8"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.893473 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zlc9d"] Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.908153 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" event={"ID":"f1227fe2-f17b-4427-a14d-b5f3be455bf7","Type":"ContainerStarted","Data":"0bff205912005a8d681dba5d42f0a313e44f9349001901000847e9adb27a0193"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.908408 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" event={"ID":"f1227fe2-f17b-4427-a14d-b5f3be455bf7","Type":"ContainerStarted","Data":"47598c5a236bdae3237c491da7bd0fbbd64e627a3a92741a34104c16fa24bc98"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.926455 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4hxkq" podStartSLOduration=119.926434609 podStartE2EDuration="1m59.926434609s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.905622967 +0000 UTC m=+139.333366773" watchObservedRunningTime="2025-12-16 12:49:13.926434609 +0000 UTC m=+139.354178405" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.927780 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4"] Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.935540 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" event={"ID":"d05c7923-2a21-4ed9-84e9-9298b1b0fed9","Type":"ContainerStarted","Data":"97ba6744647195e6afb457c54268d67d512621f05e37770eeab4fcfe1df320d4"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.939581 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-m8zkr" podStartSLOduration=5.939563437 podStartE2EDuration="5.939563437s" podCreationTimestamp="2025-12-16 12:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.935543818 +0000 UTC m=+139.363287614" watchObservedRunningTime="2025-12-16 12:49:13.939563437 +0000 UTC m=+139.367307233" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.973978 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6"] Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.975264 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wspqp" event={"ID":"fbd13c19-5caf-427a-a09a-1929b550dc04","Type":"ContainerStarted","Data":"de08765872e247bfa5548e3e52edd9122dc970d6d7d30190e09bfd9af944f9d7"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.975299 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wspqp" event={"ID":"fbd13c19-5caf-427a-a09a-1929b550dc04","Type":"ContainerStarted","Data":"9e7109c07e2395c9d0f7349fa31f5421c0a7c6f4f18ca94fb2b21875c498dca1"} Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.976500 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zxdsc" podStartSLOduration=119.976481413 podStartE2EDuration="1m59.976481413s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.969833136 +0000 UTC m=+139.397576932" watchObservedRunningTime="2025-12-16 12:49:13.976481413 +0000 UTC m=+139.404225209" Dec 16 12:49:13 crc kubenswrapper[4757]: I1216 12:49:13.979882 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:13 crc kubenswrapper[4757]: E1216 12:49:13.982417 4757 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.482398171 +0000 UTC m=+139.910142037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.015217 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qpp47" podStartSLOduration=120.015201942 podStartE2EDuration="2m0.015201942s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:13.996523095 +0000 UTC m=+139.424266891" watchObservedRunningTime="2025-12-16 12:49:14.015201942 +0000 UTC m=+139.442945738" Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.034227 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc"] Dec 16 12:49:14 crc kubenswrapper[4757]: W1216 12:49:14.036433 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0096f9c3_dbd6_42e0_a8a3_00a816988de9.slice/crio-71f434eaa745a282f14b0abe8ede31841f768a45ea0e9e964374ff6cbd7090c6 WatchSource:0}: Error finding container 71f434eaa745a282f14b0abe8ede31841f768a45ea0e9e964374ff6cbd7090c6: Status 404 returned error can't find the container with id 71f434eaa745a282f14b0abe8ede31841f768a45ea0e9e964374ff6cbd7090c6 Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.070246 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xwpj8"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.070589 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wspqp" podStartSLOduration=120.07057525 podStartE2EDuration="2m0.07057525s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:14.042570758 +0000 UTC m=+139.470314554" watchObservedRunningTime="2025-12-16 12:49:14.07057525 +0000 UTC m=+139.498319046" Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.084240 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.091750 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.093110 4757 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.592995281 +0000 UTC m=+140.020739077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.193837 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.194501 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.694486314 +0000 UTC m=+140.122230110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.292778 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.294890 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.295319 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.7953031 +0000 UTC m=+140.223046896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.321717 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.364069 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.386555 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.391820 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-m8kth"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.400247 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.400610 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:14.900597877 +0000 UTC m=+140.328341673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.424408 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.448078 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.461512 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vdk49"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.475117 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.477399 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hm8ks"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.502366 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.502928 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.00290798 +0000 UTC m=+140.430651776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: W1216 12:49:14.528345 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d0c760_7b97_4a51_9d17_1409ac8ab2d5.slice/crio-e7908110d98f07a2840cce1a084f987808f11bd5f517d0d26095ad95beb667e3 WatchSource:0}: Error finding container e7908110d98f07a2840cce1a084f987808f11bd5f517d0d26095ad95beb667e3: Status 404 returned error can't find the container with id e7908110d98f07a2840cce1a084f987808f11bd5f517d0d26095ad95beb667e3 Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.531901 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.543140 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.562849 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4485h"] Dec 16 12:49:14 crc kubenswrapper[4757]: W1216 12:49:14.566090 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95287708_9f29_44fd_a6e8_e9ad0b1e934e.slice/crio-9b8ab4c9f08ba24bab1a9cf70af7c7a0a78fe6d8b6ea2e75f2551dbb77b064f3 WatchSource:0}: Error finding container 9b8ab4c9f08ba24bab1a9cf70af7c7a0a78fe6d8b6ea2e75f2551dbb77b064f3: Status 404 returned error can't find the container with id 9b8ab4c9f08ba24bab1a9cf70af7c7a0a78fe6d8b6ea2e75f2551dbb77b064f3 Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.567089 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4j9nz"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.575363 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-57pwb"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.582100 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.604831 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.605206 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.105193272 +0000 UTC m=+140.532937068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.674668 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crbcx"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.675813 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:14 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:14 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:14 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.675847 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.705285 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.705657 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.205643689 +0000 UTC m=+140.633387485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.718515 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9g4nt"] Dec 16 12:49:14 crc kubenswrapper[4757]: W1216 12:49:14.721733 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5706c05b_ab36_4ed2_ac86_06146a1bddda.slice/crio-b24892c6e32775b26858c732932f8ed21cc532c661b333be8495d5d22ba6987a WatchSource:0}: Error finding container b24892c6e32775b26858c732932f8ed21cc532c661b333be8495d5d22ba6987a: Status 404 returned error can't find the container with id b24892c6e32775b26858c732932f8ed21cc532c661b333be8495d5d22ba6987a Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.735171 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.754924 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.755465 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.760980 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.764102 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jv58r"] Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.809937 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.810364 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.310352581 +0000 UTC m=+140.738096377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.910628 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.910837 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.410813828 +0000 UTC m=+140.838557624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:14 crc kubenswrapper[4757]: I1216 12:49:14.911025 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:14 crc kubenswrapper[4757]: E1216 12:49:14.911321 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.41130286 +0000 UTC m=+140.839046656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.011586 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.011988 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.511974482 +0000 UTC m=+140.939718278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.104789 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" event={"ID":"c96934c7-7a51-4bc4-8c1b-959334813a98","Type":"ContainerStarted","Data":"7f3fbd5bbafb08687110729faebf474ba9c164d601b09629bad1aa43a0d3a7ad"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.104839 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" event={"ID":"c96934c7-7a51-4bc4-8c1b-959334813a98","Type":"ContainerStarted","Data":"5b4f3a876e1fd5cbb4d88d30aafc770bea34e82388d2f92717d45dfa45a2c24a"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.112750 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.113113 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.613101606 +0000 UTC m=+141.040845402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.123401 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" event={"ID":"df94dff2-af59-42da-be83-0eb6c9aba353","Type":"ContainerStarted","Data":"f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.123440 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" event={"ID":"df94dff2-af59-42da-be83-0eb6c9aba353","Type":"ContainerStarted","Data":"1cadade677b43d6f8031ab7c5a8c091f733492ec996141b25e7232ef9abd34d9"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.124210 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.130024 4757 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xwpj8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.130055 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" podUID="df94dff2-af59-42da-be83-0eb6c9aba353" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.130458 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" event={"ID":"265120d9-fc19-4ca6-9ff4-3dbd22bac771","Type":"ContainerStarted","Data":"5348b63d0bb0dd549f1b26e596ba61721de41b9854e9ef00ed01c39c00e6e39e"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.130502 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" event={"ID":"265120d9-fc19-4ca6-9ff4-3dbd22bac771","Type":"ContainerStarted","Data":"8b45cf5dbe107216b7b3bba474999cd23b8dcc2948a72be7f67446afa81c9817"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.131377 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.148423 4757 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lfff4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.148477 4757 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" podUID="265120d9-fc19-4ca6-9ff4-3dbd22bac771" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.152044 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" event={"ID":"e6e99fe1-d31b-4a78-97fa-360102eab7f1","Type":"ContainerStarted","Data":"fa625871b384a93d33ffb50f66154874b245b5234db525dc652d21a8c524e204"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.182216 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" event={"ID":"8c7613a1-c06e-4d91-90e1-309ef18204be","Type":"ContainerStarted","Data":"2ddbf235959dd5c6630a7218dfa23bf9930d2ba4411caacec654aabe195ec9d9"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.203542 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zlc9d" event={"ID":"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920","Type":"ContainerStarted","Data":"89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.203583 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zlc9d" event={"ID":"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920","Type":"ContainerStarted","Data":"6b0bf25738630c3668a939dd083a58e52c93375512cee087d0272e81010485a0"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.215074 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.216357 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.716337121 +0000 UTC m=+141.144080917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.230369 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" event={"ID":"c3e91d20-09bb-4846-bcf5-d90a938ce3c7","Type":"ContainerStarted","Data":"5d3f74d676277d78d42d660f7632e4767da816d134323e4fb8a30d7219b213ad"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.280317 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" event={"ID":"78aba4b4-7b55-4500-ad35-0ffa726b6f4a","Type":"ContainerStarted","Data":"507b1fbce0f980a88a04ddba8006bc9d1208361645ff0fe473e523b2e2975556"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.280390 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" event={"ID":"78aba4b4-7b55-4500-ad35-0ffa726b6f4a","Type":"ContainerStarted","Data":"97e3ee0570dc73a25b44885844f52e27000da0b629183796a850e31e61bc0322"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.311526 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" event={"ID":"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4","Type":"ContainerStarted","Data":"f7b78fa2c021dccc1ad510e32d537206c9744ea5f28dc2fc50ec5902a180f662"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.317778 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.318054 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" event={"ID":"51608011-36e8-4875-b5c9-e9cfb96d1ef1","Type":"ContainerStarted","Data":"8f357b816217e7d7a9dd3a71762fc4e8ae9db3c5e08b11f45f04895b903524d5"} Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.318982 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.818970623 +0000 UTC m=+141.246714419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.325229 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" event={"ID":"9304ff8e-fd87-4758-b291-fb7b8a26c350","Type":"ContainerStarted","Data":"07e9575b46bfb461577cc771bc6ffeb68a98c500bc402ec44a60ec893da5b273"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.348477 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" event={"ID":"ba004648-a434-472a-9b98-1177f45eb479","Type":"ContainerStarted","Data":"85c4d1ee3736f6e546acb057ea5a8184f6ac3a6c6e0d882952e9a17a53a745d4"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.348516 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" event={"ID":"ba004648-a434-472a-9b98-1177f45eb479","Type":"ContainerStarted","Data":"9550d728d10d1162c4bcdeb1d588d4c00e9996093b8db9ffd28c50371faa3202"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.412820 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" event={"ID":"f6bb180e-b2b2-41c7-b220-71f363518413","Type":"ContainerStarted","Data":"eb4a1e938434f5232104966cea8ff98fb578afcba0c0e5b00a7ed6be24c6abd3"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.412874 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" event={"ID":"f6bb180e-b2b2-41c7-b220-71f363518413","Type":"ContainerStarted","Data":"72b0791ac8f8bad0168ce18dc526d3c3ce51a44106ba986607d95d8f02446c13"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.418789 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.419161 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:15.919142712 +0000 UTC m=+141.346886508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.422097 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" event={"ID":"766ff2cb-7de6-4600-a7d1-ed44aa3aed43","Type":"ContainerStarted","Data":"993b996fd3b4f3ccbc82ae8cb67b4bcfd5e28a55bdb3e70676b80bde46848cbe"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.425424 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" event={"ID":"95287708-9f29-44fd-a6e8-e9ad0b1e934e","Type":"ContainerStarted","Data":"9b8ab4c9f08ba24bab1a9cf70af7c7a0a78fe6d8b6ea2e75f2551dbb77b064f3"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.437672 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" event={"ID":"78d0c760-7b97-4a51-9d17-1409ac8ab2d5","Type":"ContainerStarted","Data":"e7908110d98f07a2840cce1a084f987808f11bd5f517d0d26095ad95beb667e3"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.439043 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" event={"ID":"2528329d-5900-445c-8539-80caefbe1c15","Type":"ContainerStarted","Data":"69d62d04b6812e158fca1d2a2d7686b1a90bd66513b6f57422511b252a08179f"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.439871 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vdk49" event={"ID":"33620485-0085-4ad4-a908-addb88a5d7ec","Type":"ContainerStarted","Data":"ae7a6c977c52759882c997df68353a5211c621aee5851e9603a9fd4c0f0f417b"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.449181 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" event={"ID":"6a9f576c-c2af-47cb-8a4d-f7d8784aad87","Type":"ContainerStarted","Data":"bfac3a09b98a147c5b3de1676a9bb2ad7018bf3145bd2c42c850a8c7cdb28cf8"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.459130 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" event={"ID":"ff965e39-8bf4-40d8-b7af-702f0c47bbb4","Type":"ContainerStarted","Data":"30e10523ab7d6c23ac46fd3e96ef66f43196e40a53c995d049e64ec4c8e36612"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.478886 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" event={"ID":"d862c66e-b538-48e8-bbcb-a0cb2715a7de","Type":"ContainerStarted","Data":"67d748f91a93e7003c15b9bb0b423843a396f7b322f39e4232d95dbf53942cfb"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.488554 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" event={"ID":"ff15bc2c-c190-41e3-b2a9-58656f51045d","Type":"ContainerStarted","Data":"b776d1e1b8b0ec21d9d71b6e62a5241671f512e486ddb184eb361f7ac51a8646"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.520522 4757 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" event={"ID":"822c6393-9fec-4b87-89bd-c67cb487567a","Type":"ContainerStarted","Data":"830a06330b568e935a7ec3887c4392d9f175df8609fcfddae4c513798352b09c"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.520574 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" event={"ID":"822c6393-9fec-4b87-89bd-c67cb487567a","Type":"ContainerStarted","Data":"105d4e96e690bec230edc9883b6b8f03973a921ba20715e167d680a5518c4794"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.521518 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.524863 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.024846251 +0000 UTC m=+141.452590047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.527749 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" event={"ID":"17a05c95-20b3-4922-8bb5-b658f4115b5c","Type":"ContainerStarted","Data":"ab78faae23bf5e2623ffc69a075948097fc4565f87775fbb1cf5b6d03b426c96"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.543897 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" event={"ID":"5706c05b-ab36-4ed2-ac86-06146a1bddda","Type":"ContainerStarted","Data":"b24892c6e32775b26858c732932f8ed21cc532c661b333be8495d5d22ba6987a"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.544092 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" event={"ID":"70232508-1ae7-4b91-bd26-c01e84786364","Type":"ContainerStarted","Data":"1bf32cbc092ff2555e996962c7cbc14801395391444feb954f92cfb36ad68393"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.544163 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" event={"ID":"4231c2af-eaea-4187-ba50-370c2eb81315","Type":"ContainerStarted","Data":"9c15f3a6bdc16c57f026cf8131b03e0584804c9df1bc263b8160cca64cee7662"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.544243 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" 
event={"ID":"3c2a0527-13e8-48c7-aa7d-992f5c9ca223","Type":"ContainerDied","Data":"02c132ba2deb2aeb01dc5010339c582c49a2a4898219a7f1efd3a0456398c741"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.531102 4757 generic.go:334] "Generic (PLEG): container finished" podID="3c2a0527-13e8-48c7-aa7d-992f5c9ca223" containerID="02c132ba2deb2aeb01dc5010339c582c49a2a4898219a7f1efd3a0456398c741" exitCode=0 Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.544458 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" event={"ID":"3c2a0527-13e8-48c7-aa7d-992f5c9ca223","Type":"ContainerStarted","Data":"5b221eae8025a44a228eb90b824fa0c1717e0efac361504d817317ff05e735c2"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.556189 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" event={"ID":"0096f9c3-dbd6-42e0-a8a3-00a816988de9","Type":"ContainerStarted","Data":"38dcaed97168f95b275fb9ca136a2f9e6f7db72d093508267ad99f655600d513"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.556225 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" event={"ID":"0096f9c3-dbd6-42e0-a8a3-00a816988de9","Type":"ContainerStarted","Data":"71f434eaa745a282f14b0abe8ede31841f768a45ea0e9e964374ff6cbd7090c6"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.585598 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" event={"ID":"66c0404f-467f-4b42-8cc6-ee0675ff8a55","Type":"ContainerStarted","Data":"266d55857759099bd0c652e5e506729959b99d4b17bd342577cef6b388517c2b"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.585645 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" event={"ID":"66c0404f-467f-4b42-8cc6-ee0675ff8a55","Type":"ContainerStarted","Data":"2eee44ddb29c352edded1f3260cb05fcce8face16c5de1228f7d3f6b6172e201"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.623642 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.624555 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.124539788 +0000 UTC m=+141.552283584 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.636283 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-57pwb" event={"ID":"981c93fc-8f6d-4781-a261-8fa8308422cb","Type":"ContainerStarted","Data":"8461125baf327fd3d7211f6b9e62b4d4590e2649e65e80b8ac5de8f91c8839cf"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.638235 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" event={"ID":"f0febaaf-2028-4ff9-a751-0714a00a9412","Type":"ContainerStarted","Data":"2d43410fa3d9206c1880c159a0bfbe0680f47d59117c9444974329e218780c0d"} Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.639868 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.646220 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.646267 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.681831 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:15 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:15 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:15 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.681885 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.725862 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.730611 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 12:49:16.230596225 +0000 UTC m=+141.658340021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.747920 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wspqp" Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.833022 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.833961 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.333943703 +0000 UTC m=+141.761687499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:15 crc kubenswrapper[4757]: I1216 12:49:15.944364 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:15 crc kubenswrapper[4757]: E1216 12:49:15.944901 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.444888513 +0000 UTC m=+141.872632309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.045824 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.046244 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.546229811 +0000 UTC m=+141.973973607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.147079 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.147599 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.64758761 +0000 UTC m=+142.075331406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.249095 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.249448 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.749431991 +0000 UTC m=+142.177175787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.355893 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.356281 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.856266138 +0000 UTC m=+142.284009934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.406021 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" podStartSLOduration=122.405987783 podStartE2EDuration="2m2.405987783s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.344968124 +0000 UTC m=+141.772711920" watchObservedRunningTime="2025-12-16 12:49:16.405987783 +0000 UTC m=+141.833731579" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.406825 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zlc9d" podStartSLOduration=122.406818135 podStartE2EDuration="2m2.406818135s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.40545121 +0000 UTC m=+141.833195026" watchObservedRunningTime="2025-12-16 12:49:16.406818135 +0000 UTC m=+141.834561931" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.458884 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.459308 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:16.959288388 +0000 UTC m=+142.387032184 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.522973 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" podStartSLOduration=122.522952324 podStartE2EDuration="2m2.522952324s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.443490563 +0000 UTC m=+141.871234359" watchObservedRunningTime="2025-12-16 12:49:16.522952324 +0000 UTC m=+141.950696120" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.565130 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.565784 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.065769416 +0000 UTC m=+142.493513212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.567082 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" podStartSLOduration=122.567068578 podStartE2EDuration="2m2.567068578s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.524053691 +0000 UTC m=+141.951797497" watchObservedRunningTime="2025-12-16 12:49:16.567068578 +0000 UTC m=+141.994812374" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.621986 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nz84v" podStartSLOduration=122.621964143 podStartE2EDuration="2m2.621964143s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.620477656 +0000 UTC m=+142.048221462" watchObservedRunningTime="2025-12-16 12:49:16.621964143 +0000 UTC m=+142.049707939" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.660788 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wg8vk" podStartSLOduration=122.660771266 podStartE2EDuration="2m2.660771266s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.658387736 +0000 UTC m=+142.086131532" watchObservedRunningTime="2025-12-16 12:49:16.660771266 +0000 UTC m=+142.088515062" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.674448 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.674770 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.174753416 +0000 UTC m=+142.602497212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.682234 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:16 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:16 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:16 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.682271 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.686071 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" event={"ID":"78d0c760-7b97-4a51-9d17-1409ac8ab2d5","Type":"ContainerStarted","Data":"22ec07976f9d30be6fec8f978e8788f8f80f6710daf6318c046c03c1903defa5"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.686815 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.699169 4757 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-v479d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.699234 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" podUID="78d0c760-7b97-4a51-9d17-1409ac8ab2d5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.701109 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" event={"ID":"822c6393-9fec-4b87-89bd-c67cb487567a","Type":"ContainerStarted","Data":"855b7a85f0128269397673c9384ba71fb2e1cc1bed5322b8b2143d56a5e4bef1"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.729675 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" event={"ID":"95287708-9f29-44fd-a6e8-e9ad0b1e934e","Type":"ContainerStarted","Data":"2475e293b302df0536fee273b7943dfad1421ff24c4981412f8b55631774d5a0"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.778232 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" 
event={"ID":"4231c2af-eaea-4187-ba50-370c2eb81315","Type":"ContainerStarted","Data":"b06d66805a515862aac3531f1885bf9a1d251039ae42259c086377f0a67823e9"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.779023 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.780425 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.280414913 +0000 UTC m=+142.708158709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.808178 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-96tn8" podStartSLOduration=122.808162218 podStartE2EDuration="2m2.808162218s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.807651275 +0000 UTC m=+142.235395071" watchObservedRunningTime="2025-12-16 12:49:16.808162218 +0000 UTC m=+142.235906014" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.808898 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vdk49" event={"ID":"33620485-0085-4ad4-a908-addb88a5d7ec","Type":"ContainerStarted","Data":"165c0cb6c0fddf273600c7651d701e1f35af09352b670b3573f8c7b941232039"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.842784 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" event={"ID":"70232508-1ae7-4b91-bd26-c01e84786364","Type":"ContainerStarted","Data":"7308cb598744cfd390cfc079ee9ec9ae0a793c364ee211973035c41911cbec2b"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.881812 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" event={"ID":"d862c66e-b538-48e8-bbcb-a0cb2715a7de","Type":"ContainerStarted","Data":"22218cbb717092852de7d52ee5e3fcb2ec09dfcb7e9a9cd1ada61446d0c8efb4"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.882068 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:16 crc kubenswrapper[4757]: E1216 12:49:16.882673 4757 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.382654574 +0000 UTC m=+142.810398360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.914687 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-57pwb" event={"ID":"981c93fc-8f6d-4781-a261-8fa8308422cb","Type":"ContainerStarted","Data":"3eed72c0b9ab4ad204651b28503cfc321abfdaa4eaaae654a17739f9430c0017"} Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.966584 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" podStartSLOduration=122.966565176 podStartE2EDuration="2m2.966565176s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.943945519 +0000 UTC m=+142.371689315" watchObservedRunningTime="2025-12-16 12:49:16.966565176 +0000 UTC m=+142.394308972" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.967302 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbjnc" podStartSLOduration=122.967293395 podStartE2EDuration="2m2.967293395s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:16.871882365 +0000 UTC m=+142.299626161" watchObservedRunningTime="2025-12-16 12:49:16.967293395 +0000 UTC m=+142.395037191" Dec 16 12:49:16 crc kubenswrapper[4757]: I1216 12:49:16.984825 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:16.994702 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.49468656 +0000 UTC m=+142.922430356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.002219 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" event={"ID":"cf6bf9c1-5d43-4ab3-a38f-d96308345ff4","Type":"ContainerStarted","Data":"30aaba854ed372b9d3315d66f1a5331fedde5b0cd9bcb3aa4364e5009553872c"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.016963 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5pm5v" podStartSLOduration=123.016934788 podStartE2EDuration="2m3.016934788s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.013899252 +0000 UTC m=+142.441643058" watchObservedRunningTime="2025-12-16 12:49:17.016934788 +0000 UTC m=+142.444678584" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.044659 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-m8kth" event={"ID":"766ff2cb-7de6-4600-a7d1-ed44aa3aed43","Type":"ContainerStarted","Data":"810f92cc7f9bc6ba86b44282d85e3f0357448e4f0604cae985469c12112d1585"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.070895 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" event={"ID":"9304ff8e-fd87-4758-b291-fb7b8a26c350","Type":"ContainerStarted","Data":"49c08c43efe0da88e3cf09387903be2193073e200559b3724f1dd3467edb1d37"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.103685 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.104842 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.60482793 +0000 UTC m=+143.032571726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.115876 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m679c" event={"ID":"f0febaaf-2028-4ff9-a751-0714a00a9412","Type":"ContainerStarted","Data":"198b9e03de990d3f290ef4946600837bf2923588e9522838314664e633898a78"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.115935 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.119955 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" event={"ID":"8c7613a1-c06e-4d91-90e1-309ef18204be","Type":"ContainerStarted","Data":"91e8baf7987d584e8b34d7f255be22409a918fca8b1535610a726293b96f4878"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.138903 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" event={"ID":"6a9f576c-c2af-47cb-8a4d-f7d8784aad87","Type":"ContainerStarted","Data":"3a8e747ab0d135e7447e6c2be939a8321e74beae1e64e9aaf4d1d04f881d37a0"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.139665 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.147802 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" event={"ID":"78aba4b4-7b55-4500-ad35-0ffa726b6f4a","Type":"ContainerStarted","Data":"5bd4465fb2e1e67f1d960f0ebbba87acc235d00c31e137895a98f7db6caecd24"} Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.151171 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.151233 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.167281 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lfff4" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.190382 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.205116 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.207227 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.707215454 +0000 UTC m=+143.134959250 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.306118 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.306429 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" podStartSLOduration=123.306406559 podStartE2EDuration="2m3.306406559s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.299960789 +0000 UTC m=+142.727704585" watchObservedRunningTime="2025-12-16 12:49:17.306406559 +0000 UTC m=+142.734150375" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.306516 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.806502712 +0000 UTC m=+143.234246508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.307892 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-57pwb" podStartSLOduration=9.307886366 podStartE2EDuration="9.307886366s" podCreationTimestamp="2025-12-16 12:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.147618712 +0000 UTC m=+142.575362508" watchObservedRunningTime="2025-12-16 12:49:17.307886366 +0000 UTC m=+142.735630162" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.410705 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.410966 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:17.910955079 +0000 UTC m=+143.338698865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.512363 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.512524 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.012496332 +0000 UTC m=+143.440240128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.512860 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.513190 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.01318145 +0000 UTC m=+143.440925246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.579137 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2tgl4" podStartSLOduration=123.579120901 podStartE2EDuration="2m3.579120901s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.478852329 +0000 UTC m=+142.906596125" watchObservedRunningTime="2025-12-16 12:49:17.579120901 +0000 UTC m=+143.006864697" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.579299 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" podStartSLOduration=123.579295696 podStartE2EDuration="2m3.579295696s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.576339672 +0000 UTC m=+143.004083468" watchObservedRunningTime="2025-12-16 12:49:17.579295696 +0000 UTC m=+143.007039492" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.613856 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.614101 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.114076087 +0000 UTC m=+143.541819883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.614364 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.614718 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.114702592 +0000 UTC m=+143.542446388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.635734 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr4dp" podStartSLOduration=123.635713759 podStartE2EDuration="2m3.635713759s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.63374228 +0000 UTC m=+143.061486076" watchObservedRunningTime="2025-12-16 12:49:17.635713759 +0000 UTC m=+143.063457555" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.682544 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:17 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:17 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:17 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.682598 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.715540 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.715912 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.215897128 +0000 UTC m=+143.643640914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.734969 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" podStartSLOduration=123.734952105 podStartE2EDuration="2m3.734952105s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.70241939 +0000 UTC m=+143.130163186" watchObservedRunningTime="2025-12-16 12:49:17.734952105 +0000 UTC m=+143.162695911" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.780970 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hm8ks" podStartSLOduration=123.780951907 podStartE2EDuration="2m3.780951907s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.735709044 +0000 UTC m=+143.163452840" watchObservedRunningTime="2025-12-16 12:49:17.780951907 +0000 UTC m=+143.208695703" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.817876 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.818232 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.318217661 +0000 UTC m=+143.745961457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.835106 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qnb8m" podStartSLOduration=123.835090694 podStartE2EDuration="2m3.835090694s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.832896598 +0000 UTC m=+143.260640384" watchObservedRunningTime="2025-12-16 12:49:17.835090694 +0000 UTC m=+143.262834490" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.872733 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7ppdm"] Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.874224 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.885185 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.905403 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7ppdm"] Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.918498 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.918717 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-catalog-content\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.918800 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7jn\" (UniqueName: \"kubernetes.io/projected/c8ab79c2-762d-4773-ae6e-6e92acdf4508-kube-api-access-cq7jn\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.918841 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-utilities\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:17 crc kubenswrapper[4757]: E1216 12:49:17.918941 4757 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.418926434 +0000 UTC m=+143.846670220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.990747 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-45kh6" podStartSLOduration=123.990728322 podStartE2EDuration="2m3.990728322s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.990288402 +0000 UTC m=+143.418032208" watchObservedRunningTime="2025-12-16 12:49:17.990728322 +0000 UTC m=+143.418472118" Dec 16 12:49:17 crc kubenswrapper[4757]: I1216 12:49:17.991901 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" podStartSLOduration=123.991890392 podStartE2EDuration="2m3.991890392s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:17.938755351 +0000 UTC m=+143.366499147" watchObservedRunningTime="2025-12-16 12:49:17.991890392 +0000 UTC m=+143.419634208" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.020627 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7jn\" (UniqueName: \"kubernetes.io/projected/c8ab79c2-762d-4773-ae6e-6e92acdf4508-kube-api-access-cq7jn\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.020835 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.020873 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-utilities\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.020920 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-catalog-content\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " 
pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.021358 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-catalog-content\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.021549 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-utilities\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.022144 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.522126459 +0000 UTC m=+143.949870255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.047102 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqtlh"] Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.048388 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.056375 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.090039 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7jn\" (UniqueName: \"kubernetes.io/projected/c8ab79c2-762d-4773-ae6e-6e92acdf4508-kube-api-access-cq7jn\") pod \"community-operators-7ppdm\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.123183 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.123471 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-catalog-content\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.123558 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjt2g\" (UniqueName: \"kubernetes.io/projected/7320d121-c9e6-4af2-ad14-4db89ea38a9e-kube-api-access-pjt2g\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.123601 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-utilities\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.123813 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.623782985 +0000 UTC m=+144.051526781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.139601 4757 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5l22d container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.139952 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" podUID="6a9f576c-c2af-47cb-8a4d-f7d8784aad87" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.155559 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqtlh"] Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.164930 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" event={"ID":"c3e91d20-09bb-4846-bcf5-d90a938ce3c7","Type":"ContainerStarted","Data":"112cffb302fdf3a1f6b6d8906cc345d759a367e847f60e2a5bf767ff05c02cad"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.171803 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" event={"ID":"70232508-1ae7-4b91-bd26-c01e84786364","Type":"ContainerStarted","Data":"bc95bc1e5eab67170919b6270b6b07bcb692f98aee51a03be547b8990e2aabdf"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.176283 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" event={"ID":"4231c2af-eaea-4187-ba50-370c2eb81315","Type":"ContainerStarted","Data":"e2e18301fcadbe72110b43dff48370f91150bf2b658df20145f7eec0e132bc64"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.180282 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" event={"ID":"3c2a0527-13e8-48c7-aa7d-992f5c9ca223","Type":"ContainerStarted","Data":"45a9e569981a2e2f7f7fa25602965885f4e94d5bb4ce6940ca5187bcc9c71868"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.182318 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" event={"ID":"e6e99fe1-d31b-4a78-97fa-360102eab7f1","Type":"ContainerStarted","Data":"1c782ec0a0040600aa8bc36939ba3f0707b7eed67ff44fd73c41e8b101ebfad2"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.182346 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" event={"ID":"e6e99fe1-d31b-4a78-97fa-360102eab7f1","Type":"ContainerStarted","Data":"abb79c6276a8b3a649707ab8a4741a6161134406757ab83a65e609c8653230fa"} Dec 16 
12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.185413 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" event={"ID":"ba004648-a434-472a-9b98-1177f45eb479","Type":"ContainerStarted","Data":"c8d1a65d7e84b480de4bd0a8ca76a01cd3212bf4b7524dccf9080c865253ec99"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.198178 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vdk49" event={"ID":"33620485-0085-4ad4-a908-addb88a5d7ec","Type":"ContainerStarted","Data":"bb139b2a2a40622888f9ea8eff82f6ba9006e3a1cb4fe2623f1e26e7e30c2ffe"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.198885 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.204694 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.204753 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" event={"ID":"9304ff8e-fd87-4758-b291-fb7b8a26c350","Type":"ContainerStarted","Data":"0e2265e863a44241ae39b8bd31d22bc162a0d0de71e586a3675e2057bc0dffe2"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.205354 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.225750 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-utilities\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.225888 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-catalog-content\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.225993 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.226072 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjt2g\" (UniqueName: \"kubernetes.io/projected/7320d121-c9e6-4af2-ad14-4db89ea38a9e-kube-api-access-pjt2g\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.227986 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-utilities\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " 
pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.230512 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.730494169 +0000 UTC m=+144.158238055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.243330 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-catalog-content\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.244147 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xnmf5" podStartSLOduration=124.24413316 podStartE2EDuration="2m4.24413316s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.190292101 +0000 UTC m=+143.618035897" watchObservedRunningTime="2025-12-16 12:49:18.24413316 +0000 UTC m=+143.671876956" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.256045 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4485h" podStartSLOduration=124.256022568 podStartE2EDuration="2m4.256022568s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.227057183 +0000 UTC m=+143.654800979" watchObservedRunningTime="2025-12-16 12:49:18.256022568 +0000 UTC m=+143.683766364" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.256609 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" event={"ID":"5706c05b-ab36-4ed2-ac86-06146a1bddda","Type":"ContainerStarted","Data":"16838a1f8a7a7bcbdd967839d36df6d30ef8312ac5f3969b68c679221368b9e6"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.257924 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.272119 4757 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-crbcx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.272176 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" 
podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.272486 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bbrff"] Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.273649 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.278054 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" event={"ID":"2528329d-5900-445c-8539-80caefbe1c15","Type":"ContainerStarted","Data":"b6add25506cedb04210d094dc2755b4e13579f1525faf75d510a283bdf33d838"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.304160 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4j9nz" event={"ID":"ff15bc2c-c190-41e3-b2a9-58656f51045d","Type":"ContainerStarted","Data":"c4084df2b2adafdc6562d580de88f151a7e1a1c8b2392140e093a68cdf01b10a"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.324266 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjt2g\" (UniqueName: \"kubernetes.io/projected/7320d121-c9e6-4af2-ad14-4db89ea38a9e-kube-api-access-pjt2g\") pod \"certified-operators-vqtlh\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.330819 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.331088 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7q6n\" (UniqueName: \"kubernetes.io/projected/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-kube-api-access-q7q6n\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.331167 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.83114109 +0000 UTC m=+144.258884886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.331478 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-catalog-content\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.331558 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-utilities\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.331609 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.331878 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.831869839 +0000 UTC m=+144.259613705 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.349592 4757 generic.go:334] "Generic (PLEG): container finished" podID="51608011-36e8-4875-b5c9-e9cfb96d1ef1" containerID="1eaa99cc65a252f84cbe574125993b32f669648802a472b15d420ee6a18d6f5d" exitCode=0 Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.349977 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" event={"ID":"51608011-36e8-4875-b5c9-e9cfb96d1ef1","Type":"ContainerDied","Data":"1eaa99cc65a252f84cbe574125993b32f669648802a472b15d420ee6a18d6f5d"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.368987 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vlrtg" podStartSLOduration=124.368972618 podStartE2EDuration="2m4.368972618s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.367633194 +0000 UTC m=+143.795376990" watchObservedRunningTime="2025-12-16 12:49:18.368972618 +0000 UTC m=+143.796716414" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.369293 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.386131 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbrff"] Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.389202 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" event={"ID":"17a05c95-20b3-4922-8bb5-b658f4115b5c","Type":"ContainerStarted","Data":"36bc5c7fb878e846de301eaa9283ac8085619b72daf778ebac6bc3c2d7519537"} Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.419284 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v479d" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.435440 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.435915 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-catalog-content\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.436058 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-utilities\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.436162 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7q6n\" (UniqueName: \"kubernetes.io/projected/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-kube-api-access-q7q6n\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.436710 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:18.936659973 +0000 UTC m=+144.364403819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.438279 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-utilities\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.441998 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-catalog-content\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.484251 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9nvbm" podStartSLOduration=124.484229955 podStartE2EDuration="2m4.484229955s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.482720667 +0000 UTC m=+143.910464463" watchObservedRunningTime="2025-12-16 12:49:18.484229955 +0000 UTC m=+143.911973741" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.486222 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-srvcx"] Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.487416 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.540329 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgw9n\" (UniqueName: \"kubernetes.io/projected/8bca5421-7190-4386-9a4d-fc01e88be52e-kube-api-access-zgw9n\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.540400 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-catalog-content\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.540494 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.540516 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-utilities\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.540887 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.040876394 +0000 UTC m=+144.468620190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.543563 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7q6n\" (UniqueName: \"kubernetes.io/projected/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-kube-api-access-q7q6n\") pod \"community-operators-bbrff\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.563346 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srvcx"] Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.603339 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.646462 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.646690 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-catalog-content\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.646785 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-utilities\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.646829 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgw9n\" (UniqueName: \"kubernetes.io/projected/8bca5421-7190-4386-9a4d-fc01e88be52e-kube-api-access-zgw9n\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.647204 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.147186047 +0000 UTC m=+144.574929843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.647539 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-catalog-content\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.647765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-utilities\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.682978 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:18 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:18 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:18 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.683038 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.718014 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6hp8j" podStartSLOduration=124.71798311 podStartE2EDuration="2m4.71798311s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.702295357 +0000 UTC m=+144.130039153" watchObservedRunningTime="2025-12-16 12:49:18.71798311 +0000 UTC m=+144.145726906" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.719189 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" podStartSLOduration=124.71918281 podStartE2EDuration="2m4.71918281s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.623118305 +0000 UTC m=+144.050862101" watchObservedRunningTime="2025-12-16 12:49:18.71918281 +0000 UTC m=+144.146926606" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.749927 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.750825 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.250802653 +0000 UTC m=+144.678546449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.808677 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgw9n\" (UniqueName: \"kubernetes.io/projected/8bca5421-7190-4386-9a4d-fc01e88be52e-kube-api-access-zgw9n\") pod \"certified-operators-srvcx\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") " pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.814193 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-srvcx" Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.860730 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.861054 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.361039564 +0000 UTC m=+144.788783360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.961840 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:18 crc kubenswrapper[4757]: E1216 12:49:18.962224 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 12:49:19.462199329 +0000 UTC m=+144.889943125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:18 crc kubenswrapper[4757]: I1216 12:49:18.983427 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vdk49" podStartSLOduration=10.98340858 podStartE2EDuration="10.98340858s" podCreationTimestamp="2025-12-16 12:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:18.981553534 +0000 UTC m=+144.409297330" watchObservedRunningTime="2025-12-16 12:49:18.98340858 +0000 UTC m=+144.411152376"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.062535 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.063183 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.563168908 +0000 UTC m=+144.990912704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.091628 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" podStartSLOduration=125.09161512 podStartE2EDuration="2m5.09161512s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:19.09042981 +0000 UTC m=+144.518173616" watchObservedRunningTime="2025-12-16 12:49:19.09161512 +0000 UTC m=+144.519358916"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.170364 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr"
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.170693 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.670677601 +0000 UTC m=+145.098421397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.271060 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.271457 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.771440345 +0000 UTC m=+145.199184141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.382311 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr"
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.382811 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.882798885 +0000 UTC m=+145.310542681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.408653 4757 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5l22d container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.408704 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" podUID="6a9f576c-c2af-47cb-8a4d-f7d8784aad87" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.409412 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" podStartSLOduration=125.409399941 podStartE2EDuration="2m5.409399941s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:19.306489323 +0000 UTC m=+144.734233129" watchObservedRunningTime="2025-12-16 12:49:19.409399941 +0000 UTC m=+144.837143737"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.414670 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" event={"ID":"51608011-36e8-4875-b5c9-e9cfb96d1ef1","Type":"ContainerStarted","Data":"399e5bd8f2e03db32a499aa5a28fd663037a975cf7b15cedba074d923463f93d"}
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.435548 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" event={"ID":"17a05c95-20b3-4922-8bb5-b658f4115b5c","Type":"ContainerStarted","Data":"1d66750229be76cad446af27ad9799c351437e0d6f4027aa183cb67d4d12a749"}
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.439727 4757 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-crbcx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.439968 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.488650 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.489098 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:19.989077967 +0000 UTC m=+145.416821763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.502559 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k5bsp" podStartSLOduration=125.502541575 podStartE2EDuration="2m5.502541575s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:19.502228806 +0000 UTC m=+144.929972612" watchObservedRunningTime="2025-12-16 12:49:19.502541575 +0000 UTC m=+144.930285361"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.590165 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr"
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.593543 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.093527024 +0000 UTC m=+145.521270900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.647265 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbrff"]
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.677163 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 16 12:49:19 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld
Dec 16 12:49:19 crc kubenswrapper[4757]: [+]process-running ok
Dec 16 12:49:19 crc kubenswrapper[4757]: healthz check failed
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.677675 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.700146 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.700505 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.200458212 +0000 UTC m=+145.628202008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.813239 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr"
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.813556 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.313541155 +0000 UTC m=+145.741284951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
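The failures above repeat on a 500ms backoff for a single reason: the kubelet's CSI plugin registry does not yet contain kubevirt.io.hostpath-provisioner, so both the mount path (attacher.MountDevice) and the unmount path (Unmounter.TearDownAt) fail to build a CSI client and nestedpendingoperations requeues them. The cluster-visible mirror of that per-node registry is the CSINode object. A minimal sketch, assuming client-go, a reachable kubeconfig path, and the node name "crc" taken from this journal (illustrative tooling, not part of the node's own stack):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: kubeconfig location; adjust for the environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// The CSINode object mirrors the kubelet's registered-driver list that the
	// errors above consult; "crc" is the node name from this journal.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Printf("registered CSI driver: %s (nodeID %s)\n", d.Name, d.NodeID)
	}
}

Until the driver appears in spec.drivers, every retry keeps failing with the same "not found in the list of registered CSI drivers" error seen throughout this stretch of the log.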
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.840212 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d"
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.915155 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:19 crc kubenswrapper[4757]: E1216 12:49:19.915662 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.415643473 +0000 UTC m=+145.843387269 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:19 crc kubenswrapper[4757]: I1216 12:49:19.922366 4757 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.016777 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr"
Dec 16 12:49:20 crc kubenswrapper[4757]: E1216 12:49:20.017381 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.517368461 +0000 UTC m=+145.945112257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.017916 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqtlh"]
Dec 16 12:49:20 crc kubenswrapper[4757]: W1216 12:49:20.072350 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7320d121_c9e6_4af2_ad14_4db89ea38a9e.slice/crio-b97732ad2a152909e27c0054434d7e3dfdc2f2baf7869704a7bc644c39b762e2 WatchSource:0}: Error finding container b97732ad2a152909e27c0054434d7e3dfdc2f2baf7869704a7bc644c39b762e2: Status 404 returned error can't find the container with id b97732ad2a152909e27c0054434d7e3dfdc2f2baf7869704a7bc644c39b762e2
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.115478 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krvn2"]
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.116692 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.124724 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:20 crc kubenswrapper[4757]: E1216 12:49:20.125157 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.6251378 +0000 UTC m=+146.052881596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.127408 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.142763 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krvn2"]
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.215090 4757 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-16T12:49:19.922396451Z","Handler":null,"Name":""}
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.227635 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-catalog-content\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.227695 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-utilities\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.227731 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lngbs\" (UniqueName: \"kubernetes.io/projected/17e402cb-44b0-4232-8671-b7db09c8e9b1-kube-api-access-lngbs\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.227764 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr"
Dec 16 12:49:20 crc kubenswrapper[4757]: E1216 12:49:20.228095 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 12:49:20.72808425 +0000 UTC m=+146.155828046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ws9qr" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.273517 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srvcx"]
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.274152 4757 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.274182 4757 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.327346 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7ppdm"]
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.330233 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.330431 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-catalog-content\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.330473 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-utilities\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.330526 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lngbs\" (UniqueName: \"kubernetes.io/projected/17e402cb-44b0-4232-8671-b7db09c8e9b1-kube-api-access-lngbs\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.335296 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-catalog-content\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.335753 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-utilities\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2"
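The entries from plugin_watcher.go:194, reconciler.go:161, and csi_plugin.go:100/113 are the kubelet's plugin-registration handshake: the watcher notices the -reg.sock file under /var/lib/kubelet/plugins_registry/, the reconciler starts RegisterPlugin, and the CSI handler validates the advertised endpoint and version before adding the driver to its registry. The registrar side of that socket is a small gRPC service. A minimal sketch in the style of the CSI node-driver-registrar sidecar, using the kubelet's pluginregistration v1 API (an illustrative assumption, not the hostpath provisioner's actual source; the name, endpoint, and version strings are taken from the log):

package main

import (
	"context"
	"net"

	"google.golang.org/grpc"
	registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
)

type registrar struct{}

// GetInfo is the first call the kubelet makes after plugin_watcher.go sees
// the -reg.sock under /var/lib/kubelet/plugins_registry/.
func (registrar) GetInfo(ctx context.Context, req *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
	return &registerapi.PluginInfo{
		Type:              registerapi.CSIPlugin,
		Name:              "kubevirt.io.hostpath-provisioner",               // driver name from the log
		Endpoint:          "/var/lib/kubelet/plugins/csi-hostpath/csi.sock", // endpoint from csi_plugin.go:100
		SupportedVersions: []string{"1.0.0"},                                // "versions: 1.0.0" in the log
	}, nil
}

// NotifyRegistrationStatus is the kubelet's acknowledgement; after a
// successful ack the driver is in the registered list.
func (registrar) NotifyRegistrationStatus(ctx context.Context, status *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
	return &registerapi.RegistrationStatusResponse{}, nil
}

func main() {
	sock := "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
	lis, err := net.Listen("unix", sock)
	if err != nil {
		panic(err)
	}
	srv := grpc.NewServer()
	registerapi.RegisterRegistrationServer(srv, registrar{})
	_ = srv.Serve(lis)
}

Once this handshake completes, the driver is in the kubelet's registry and the pending retries against pvc-657094db can finally make progress, which is what the log shows next.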
\"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.418602 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngbs\" (UniqueName: \"kubernetes.io/projected/17e402cb-44b0-4232-8671-b7db09c8e9b1-kube-api-access-lngbs\") pod \"redhat-marketplace-krvn2\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.475821 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4dkc"] Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.476758 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.486311 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.491340 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" event={"ID":"17a05c95-20b3-4922-8bb5-b658f4115b5c","Type":"ContainerStarted","Data":"7e2218235bcc3a1f9d9352ae961e6c14572768c309bba19989ede801d6f5d70c"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.504075 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerStarted","Data":"57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.504111 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerStarted","Data":"a3e9c04c3b1022bd86a34470adf71fd5e2564643a59e613fcf0b475c880e27dd"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.532277 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ppdm" event={"ID":"c8ab79c2-762d-4773-ae6e-6e92acdf4508","Type":"ContainerStarted","Data":"95f8b5349b38ecc911f7e9c9934ea7fb5172cc619213ace00f8a0e52952b7d06"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.533283 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4dkc"] Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.554687 4757 generic.go:334] "Generic (PLEG): container finished" podID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerID="5fbdffd3c3bd81970eb94f3fa08ee4c7c65bd5c2d16f9d1d86384bdeeebbce73" exitCode=0 Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.554774 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqtlh" event={"ID":"7320d121-c9e6-4af2-ad14-4db89ea38a9e","Type":"ContainerDied","Data":"5fbdffd3c3bd81970eb94f3fa08ee4c7c65bd5c2d16f9d1d86384bdeeebbce73"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.554799 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqtlh" event={"ID":"7320d121-c9e6-4af2-ad14-4db89ea38a9e","Type":"ContainerStarted","Data":"b97732ad2a152909e27c0054434d7e3dfdc2f2baf7869704a7bc644c39b762e2"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.556650 4757 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.563244 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srvcx" event={"ID":"8bca5421-7190-4386-9a4d-fc01e88be52e","Type":"ContainerStarted","Data":"bdae43a78c0411976da8284ea438f5c12ce33702205089d919a1080ee4f2c0d8"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.567744 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" event={"ID":"51608011-36e8-4875-b5c9-e9cfb96d1ef1","Type":"ContainerStarted","Data":"0fa56a895956fc567b5f8040c3ef587f72ac92d154cd201c5bef676222d45b85"} Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.569164 4757 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-crbcx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.569202 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.635928 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-catalog-content\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.635976 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zzvr\" (UniqueName: \"kubernetes.io/projected/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-kube-api-access-6zzvr\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.636020 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-utilities\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.678837 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.685747 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:20 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:20 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:20 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.685802 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.737589 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.737663 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-catalog-content\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.737724 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zzvr\" (UniqueName: \"kubernetes.io/projected/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-kube-api-access-6zzvr\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.737799 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-utilities\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.740411 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-catalog-content\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.742164 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-utilities\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.791920 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" podStartSLOduration=126.791884303 podStartE2EDuration="2m6.791884303s" 
podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:20.785982095 +0000 UTC m=+146.213725891" watchObservedRunningTime="2025-12-16 12:49:20.791884303 +0000 UTC m=+146.219628099" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.794977 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zzvr\" (UniqueName: \"kubernetes.io/projected/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-kube-api-access-6zzvr\") pod \"redhat-marketplace-x4dkc\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.798045 4757 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.798084 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.808369 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.958079 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.962112 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.970690 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:20 crc kubenswrapper[4757]: I1216 12:49:20.991584 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.022858 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.023908 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.024870 4757 patch_prober.go:28] interesting pod/console-f9d7485db-zlc9d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.024922 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zlc9d" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" probeResult="failure" output="Get 
\"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.042454 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.042503 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.042743 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.042765 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.043313 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.043616 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.044382 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.048444 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.051277 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smj7p"] Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.052248 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.053976 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.090911 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.096109 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smj7p"] Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.144710 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx48j\" (UniqueName: \"kubernetes.io/projected/7c067ef6-5957-4cfd-be96-788f4236d990-kube-api-access-xx48j\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.145061 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-utilities\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.145085 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.145122 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-catalog-content\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.145203 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.154224 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.157570 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.181698 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.181739 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.247637 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx48j\" (UniqueName: \"kubernetes.io/projected/7c067ef6-5957-4cfd-be96-788f4236d990-kube-api-access-xx48j\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.247728 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-utilities\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.247758 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-catalog-content\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.248489 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-utilities\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.248696 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-catalog-content\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.271468 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2hf5q"] Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.272748 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.306436 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx48j\" (UniqueName: \"kubernetes.io/projected/7c067ef6-5957-4cfd-be96-788f4236d990-kube-api-access-xx48j\") pod \"redhat-operators-smj7p\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.323904 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ws9qr\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.345522 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hf5q"] Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.374651 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.377912 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.378385 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.457105 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lxm\" (UniqueName: \"kubernetes.io/projected/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-kube-api-access-48lxm\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.457343 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-catalog-content\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.457374 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-utilities\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.524461 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.558647 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lxm\" (UniqueName: \"kubernetes.io/projected/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-kube-api-access-48lxm\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.558695 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-catalog-content\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.558922 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-utilities\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.559857 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-utilities\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.560340 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-catalog-content\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.567579 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.606045 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lxm\" (UniqueName: \"kubernetes.io/projected/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-kube-api-access-48lxm\") pod \"redhat-operators-2hf5q\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") " pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.616307 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" event={"ID":"17a05c95-20b3-4922-8bb5-b658f4115b5c","Type":"ContainerStarted","Data":"3ab8b91220cf7359d21e053993844285b14c586166f5b5193254e954b338b692"} Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.634227 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.640403 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerID="75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6" exitCode=0 Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.640493 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ppdm" event={"ID":"c8ab79c2-762d-4773-ae6e-6e92acdf4508","Type":"ContainerDied","Data":"75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6"} Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.660835 4757 generic.go:334] "Generic (PLEG): container finished" podID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerID="57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587" exitCode=0 Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.660971 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerDied","Data":"57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587"} Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.674036 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.680203 4757 generic.go:334] "Generic (PLEG): container finished" podID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerID="f5fb11d547abc28016931b5b5569be3b6edf46368958ef9280e271e41c63e2c2" exitCode=0 Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.681420 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srvcx" event={"ID":"8bca5421-7190-4386-9a4d-fc01e88be52e","Type":"ContainerDied","Data":"f5fb11d547abc28016931b5b5569be3b6edf46368958ef9280e271e41c63e2c2"} Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.704227 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wwkqf" Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.844050 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:21 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:21 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:21 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:21 crc kubenswrapper[4757]: I1216 12:49:21.844127 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.003323 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9g4nt" podStartSLOduration=14.00329879 podStartE2EDuration="14.00329879s" podCreationTimestamp="2025-12-16 12:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:21.940255591 +0000 UTC m=+147.367999387" watchObservedRunningTime="2025-12-16 
12:49:22.00329879 +0000 UTC m=+147.431042586" Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.019956 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krvn2"] Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.237658 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4dkc"] Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.538790 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.539154 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.578828 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.678179 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:22 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:22 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:22 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.678234 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.720502 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4dkc" event={"ID":"2a2e530d-95e8-4bdc-8710-b08bb4d99f17","Type":"ContainerStarted","Data":"92a85932e771ebf9368c9939a24cbb785b401ae69edec2d6a3959098f1ae7492"} Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.722376 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krvn2" event={"ID":"17e402cb-44b0-4232-8671-b7db09c8e9b1","Type":"ContainerStarted","Data":"cd66641f2e9daf099a6f7366b5d73db4cd88703d3b706c58783053f94e00516a"} Dec 16 12:49:22 crc kubenswrapper[4757]: I1216 12:49:22.766245 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smj7p"] Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.010765 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ws9qr"] Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.196768 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hf5q"] Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.517854 4757 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jv58r container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]log ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]etcd ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/generic-apiserver-start-informers 
ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/max-in-flight-filter ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 16 12:49:23 crc kubenswrapper[4757]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 16 12:49:23 crc kubenswrapper[4757]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/project.openshift.io-projectcache ok Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 16 12:49:23 crc kubenswrapper[4757]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 16 12:49:23 crc kubenswrapper[4757]: [-]poststarthook/openshift.io-restmapperupdater failed: reason withheld Dec 16 12:49:23 crc kubenswrapper[4757]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 16 12:49:23 crc kubenswrapper[4757]: livez check failed Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.518180 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" podUID="51608011-36e8-4875-b5c9-e9cfb96d1ef1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.675159 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:23 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:23 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:23 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.676097 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.758117 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.758913 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.765274 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.766416 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.771754 4757 generic.go:334] "Generic (PLEG): container finished" podID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerID="5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8" exitCode=0 Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.772654 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.772727 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4dkc" event={"ID":"2a2e530d-95e8-4bdc-8710-b08bb4d99f17","Type":"ContainerDied","Data":"5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.805193 4757 generic.go:334] "Generic (PLEG): container finished" podID="7c067ef6-5957-4cfd-be96-788f4236d990" containerID="c869c9b88fe0ad11355187532c5b80d581a666eaeb3f70aba6bf0d417fe96e5c" exitCode=0 Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.805533 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerDied","Data":"c869c9b88fe0ad11355187532c5b80d581a666eaeb3f70aba6bf0d417fe96e5c"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.805572 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerStarted","Data":"1ecbc6fc8f3a2a1b2921688dc9750a6254807f3fab2d5361e9187a1a724db3fb"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.823631 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4a77610a743cd66980c9bbf11f86addded2a4de14cf63efb26011d025bc1eff4"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.872318 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerStarted","Data":"b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.872391 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerStarted","Data":"de56d260429dddace7b2d2e604920802e32a72818b352bc9a30454fbca939ba7"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.889351 4757 generic.go:334] "Generic (PLEG): container finished" podID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerID="a03eb83fed81ef006a1fffecffaa596621fdb3fa92365db48fac2746ba852e87" exitCode=0 Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.889450 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krvn2" 
event={"ID":"17e402cb-44b0-4232-8671-b7db09c8e9b1","Type":"ContainerDied","Data":"a03eb83fed81ef006a1fffecffaa596621fdb3fa92365db48fac2746ba852e87"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.913372 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" event={"ID":"7e7b566f-4c89-4834-ba16-f5e5286eda7e","Type":"ContainerStarted","Data":"908b405d01bbe842bb85ecfde842a2484e2dad2c3f6729feb88eb45f09f46ed9"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.914076 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ae0e927-fa11-4169-aa7c-2eeac9821264-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.914112 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ae0e927-fa11-4169-aa7c-2eeac9821264-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.969473 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"82b8b0272eadfd663328abb5acf543118e0b2e4190dda9c0708a32ca17e43745"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.969535 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0a2efd52ceb8f74e9e72192ffb134fd1f52fc3a20329b46bd7093979d2fc64a4"} Dec 16 12:49:23 crc kubenswrapper[4757]: I1216 12:49:23.975085 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"958ee430f69ac7a8b9e21ce19d5ed91405ac1947c606f4a7d36aff627198ddf9"} Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.021022 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ae0e927-fa11-4169-aa7c-2eeac9821264-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.021094 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ae0e927-fa11-4169-aa7c-2eeac9821264-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.021242 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ae0e927-fa11-4169-aa7c-2eeac9821264-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.055822 4757 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ae0e927-fa11-4169-aa7c-2eeac9821264-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.106389 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.671568 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.683167 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:24 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:24 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:24 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:24 crc kubenswrapper[4757]: I1216 12:49:24.683252 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.130359 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9bdb4eff81f7017c7f05ab9c870424f33115066bcd8dacba6b6c69fc15021379"} Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.131468 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.146321 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ae0e927-fa11-4169-aa7c-2eeac9821264","Type":"ContainerStarted","Data":"5fb7f603c007812e08bd2024f397680b833f5bf724d50513f350655574559321"} Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.170544 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4d38396441fc3337dfa5a4055b25f60573ab561e14afe57696008536479d4caf"} Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.251759 4757 generic.go:334] "Generic (PLEG): container finished" podID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerID="b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c" exitCode=0 Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.251844 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerDied","Data":"b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c"} Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.296561 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" 
event={"ID":"7e7b566f-4c89-4834-ba16-f5e5286eda7e","Type":"ContainerStarted","Data":"0e686faaf5ea9d1d25d805eddab869b5f0eef7afb7b4bfc0d41d639b92da26c5"} Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.297236 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.417329 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" podStartSLOduration=131.417308203 podStartE2EDuration="2m11.417308203s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:25.416081162 +0000 UTC m=+150.843824968" watchObservedRunningTime="2025-12-16 12:49:25.417308203 +0000 UTC m=+150.845051999" Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.675511 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:25 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:25 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:25 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:25 crc kubenswrapper[4757]: I1216 12:49:25.675611 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.594742 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.595729 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.602529 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.602733 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.614582 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.675220 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:26 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:26 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:26 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.675285 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.697991 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31e61710-9897-41fc-ab0a-2b054944d6eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.698084 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31e61710-9897-41fc-ab0a-2b054944d6eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.799062 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31e61710-9897-41fc-ab0a-2b054944d6eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.799188 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31e61710-9897-41fc-ab0a-2b054944d6eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.799269 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31e61710-9897-41fc-ab0a-2b054944d6eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 
12:49:26.835334 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31e61710-9897-41fc-ab0a-2b054944d6eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:26 crc kubenswrapper[4757]: I1216 12:49:26.938478 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.182225 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.313876 4757 generic.go:334] "Generic (PLEG): container finished" podID="d862c66e-b538-48e8-bbcb-a0cb2715a7de" containerID="22218cbb717092852de7d52ee5e3fcb2ec09dfcb7e9a9cd1ada61446d0c8efb4" exitCode=0 Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.313954 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" event={"ID":"d862c66e-b538-48e8-bbcb-a0cb2715a7de","Type":"ContainerDied","Data":"22218cbb717092852de7d52ee5e3fcb2ec09dfcb7e9a9cd1ada61446d0c8efb4"} Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.317514 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"31e61710-9897-41fc-ab0a-2b054944d6eb","Type":"ContainerStarted","Data":"c7dfe42eeed1bbee7fe93d81b421e25a770c8694431fc168be5c843e4d960f43"} Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.321736 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ae0e927-fa11-4169-aa7c-2eeac9821264","Type":"ContainerStarted","Data":"743e4c9f413ea3901dc31b12e39ceaaea61c8db976a22f4080081091d41be015"} Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.347107 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.347085845 podStartE2EDuration="4.347085845s" podCreationTimestamp="2025-12-16 12:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:27.346593363 +0000 UTC m=+152.774337159" watchObservedRunningTime="2025-12-16 12:49:27.347085845 +0000 UTC m=+152.774829651" Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.375504 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vdk49" Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.536202 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.545469 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jv58r" Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.679119 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:27 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:27 crc kubenswrapper[4757]: 
[+]process-running ok Dec 16 12:49:27 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:27 crc kubenswrapper[4757]: I1216 12:49:27.679196 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.346958 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"31e61710-9897-41fc-ab0a-2b054944d6eb","Type":"ContainerStarted","Data":"6235bc81faac4f59c7070d5049544e50cdefda1e46ff6ad67ec42c0b69ba920d"} Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.364185 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.364168294 podStartE2EDuration="2.364168294s" podCreationTimestamp="2025-12-16 12:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:49:28.360349318 +0000 UTC m=+153.788093114" watchObservedRunningTime="2025-12-16 12:49:28.364168294 +0000 UTC m=+153.791912090" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.376359 4757 generic.go:334] "Generic (PLEG): container finished" podID="6ae0e927-fa11-4169-aa7c-2eeac9821264" containerID="743e4c9f413ea3901dc31b12e39ceaaea61c8db976a22f4080081091d41be015" exitCode=0 Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.376873 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ae0e927-fa11-4169-aa7c-2eeac9821264","Type":"ContainerDied","Data":"743e4c9f413ea3901dc31b12e39ceaaea61c8db976a22f4080081091d41be015"} Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.677193 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:28 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:28 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:28 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.677452 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.746341 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.830495 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjfvn\" (UniqueName: \"kubernetes.io/projected/d862c66e-b538-48e8-bbcb-a0cb2715a7de-kube-api-access-xjfvn\") pod \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.830590 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d862c66e-b538-48e8-bbcb-a0cb2715a7de-config-volume\") pod \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.830674 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d862c66e-b538-48e8-bbcb-a0cb2715a7de-secret-volume\") pod \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\" (UID: \"d862c66e-b538-48e8-bbcb-a0cb2715a7de\") " Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.832686 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d862c66e-b538-48e8-bbcb-a0cb2715a7de-config-volume" (OuterVolumeSpecName: "config-volume") pod "d862c66e-b538-48e8-bbcb-a0cb2715a7de" (UID: "d862c66e-b538-48e8-bbcb-a0cb2715a7de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.852535 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d862c66e-b538-48e8-bbcb-a0cb2715a7de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d862c66e-b538-48e8-bbcb-a0cb2715a7de" (UID: "d862c66e-b538-48e8-bbcb-a0cb2715a7de"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.855370 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d862c66e-b538-48e8-bbcb-a0cb2715a7de-kube-api-access-xjfvn" (OuterVolumeSpecName: "kube-api-access-xjfvn") pod "d862c66e-b538-48e8-bbcb-a0cb2715a7de" (UID: "d862c66e-b538-48e8-bbcb-a0cb2715a7de"). InnerVolumeSpecName "kube-api-access-xjfvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.932132 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d862c66e-b538-48e8-bbcb-a0cb2715a7de-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.932171 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjfvn\" (UniqueName: \"kubernetes.io/projected/d862c66e-b538-48e8-bbcb-a0cb2715a7de-kube-api-access-xjfvn\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:28 crc kubenswrapper[4757]: I1216 12:49:28.932184 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d862c66e-b538-48e8-bbcb-a0cb2715a7de-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.396748 4757 generic.go:334] "Generic (PLEG): container finished" podID="31e61710-9897-41fc-ab0a-2b054944d6eb" containerID="6235bc81faac4f59c7070d5049544e50cdefda1e46ff6ad67ec42c0b69ba920d" exitCode=0 Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.396839 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"31e61710-9897-41fc-ab0a-2b054944d6eb","Type":"ContainerDied","Data":"6235bc81faac4f59c7070d5049544e50cdefda1e46ff6ad67ec42c0b69ba920d"} Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.431598 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.431876 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr" event={"ID":"d862c66e-b538-48e8-bbcb-a0cb2715a7de","Type":"ContainerDied","Data":"67d748f91a93e7003c15b9bb0b423843a396f7b322f39e4232d95dbf53942cfb"} Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.431943 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d748f91a93e7003c15b9bb0b423843a396f7b322f39e4232d95dbf53942cfb" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.674371 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:29 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:29 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:29 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.674774 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.797786 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.863418 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ae0e927-fa11-4169-aa7c-2eeac9821264-kube-api-access\") pod \"6ae0e927-fa11-4169-aa7c-2eeac9821264\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.863509 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ae0e927-fa11-4169-aa7c-2eeac9821264-kubelet-dir\") pod \"6ae0e927-fa11-4169-aa7c-2eeac9821264\" (UID: \"6ae0e927-fa11-4169-aa7c-2eeac9821264\") " Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.863790 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ae0e927-fa11-4169-aa7c-2eeac9821264-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6ae0e927-fa11-4169-aa7c-2eeac9821264" (UID: "6ae0e927-fa11-4169-aa7c-2eeac9821264"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.869657 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae0e927-fa11-4169-aa7c-2eeac9821264-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6ae0e927-fa11-4169-aa7c-2eeac9821264" (UID: "6ae0e927-fa11-4169-aa7c-2eeac9821264"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.965025 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ae0e927-fa11-4169-aa7c-2eeac9821264-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:29 crc kubenswrapper[4757]: I1216 12:49:29.965058 4757 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ae0e927-fa11-4169-aa7c-2eeac9821264-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.456946 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.457757 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ae0e927-fa11-4169-aa7c-2eeac9821264","Type":"ContainerDied","Data":"5fb7f603c007812e08bd2024f397680b833f5bf724d50513f350655574559321"} Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.457812 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fb7f603c007812e08bd2024f397680b833f5bf724d50513f350655574559321" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.675821 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:30 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:30 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:30 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.675870 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.919564 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.982620 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31e61710-9897-41fc-ab0a-2b054944d6eb-kubelet-dir\") pod \"31e61710-9897-41fc-ab0a-2b054944d6eb\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.982733 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31e61710-9897-41fc-ab0a-2b054944d6eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "31e61710-9897-41fc-ab0a-2b054944d6eb" (UID: "31e61710-9897-41fc-ab0a-2b054944d6eb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.982798 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31e61710-9897-41fc-ab0a-2b054944d6eb-kube-api-access\") pod \"31e61710-9897-41fc-ab0a-2b054944d6eb\" (UID: \"31e61710-9897-41fc-ab0a-2b054944d6eb\") " Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.983091 4757 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31e61710-9897-41fc-ab0a-2b054944d6eb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:30 crc kubenswrapper[4757]: I1216 12:49:30.985797 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e61710-9897-41fc-ab0a-2b054944d6eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "31e61710-9897-41fc-ab0a-2b054944d6eb" (UID: "31e61710-9897-41fc-ab0a-2b054944d6eb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.024041 4757 patch_prober.go:28] interesting pod/console-f9d7485db-zlc9d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.024636 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zlc9d" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.042763 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.042780 4757 patch_prober.go:28] interesting pod/downloads-7954f5f757-4hxkq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.042825 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.042833 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4hxkq" podUID="39a3c195-3130-4fbb-903e-1ac8ab630ced" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.084770 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31e61710-9897-41fc-ab0a-2b054944d6eb-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.506885 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"31e61710-9897-41fc-ab0a-2b054944d6eb","Type":"ContainerDied","Data":"c7dfe42eeed1bbee7fe93d81b421e25a770c8694431fc168be5c843e4d960f43"} Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.506928 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7dfe42eeed1bbee7fe93d81b421e25a770c8694431fc168be5c843e4d960f43" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.507065 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.675669 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:31 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:31 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:31 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:31 crc kubenswrapper[4757]: I1216 12:49:31.675734 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:32 crc kubenswrapper[4757]: I1216 12:49:32.675565 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:32 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:32 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:32 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:32 crc kubenswrapper[4757]: I1216 12:49:32.675616 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:33 crc kubenswrapper[4757]: I1216 12:49:33.679710 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:33 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:33 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:33 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:33 crc kubenswrapper[4757]: I1216 12:49:33.680098 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:34 crc kubenswrapper[4757]: I1216 12:49:34.675742 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:34 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:34 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:34 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:34 crc kubenswrapper[4757]: I1216 12:49:34.675969 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:35 crc kubenswrapper[4757]: I1216 12:49:35.675603 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:35 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:35 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:35 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:35 crc kubenswrapper[4757]: I1216 12:49:35.675654 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:36 crc kubenswrapper[4757]: I1216 12:49:36.675574 4757 patch_prober.go:28] interesting pod/router-default-5444994796-ggbw2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 12:49:36 crc kubenswrapper[4757]: [-]has-synced failed: reason withheld Dec 16 12:49:36 crc kubenswrapper[4757]: [+]process-running ok Dec 16 12:49:36 crc kubenswrapper[4757]: healthz check failed Dec 16 12:49:36 crc kubenswrapper[4757]: I1216 12:49:36.676466 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ggbw2" podUID="0497d8a5-1e85-4989-8433-6b410d8f5427" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 12:49:36 crc kubenswrapper[4757]: I1216 12:49:36.794427 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:49:36 crc kubenswrapper[4757]: I1216 12:49:36.800643 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c1b0cca-3853-4bcf-8389-2fa9c754b5e8-metrics-certs\") pod \"network-metrics-daemon-k6rww\" (UID: \"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8\") " pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:49:36 crc kubenswrapper[4757]: I1216 12:49:36.983782 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k6rww" Dec 16 12:49:37 crc kubenswrapper[4757]: I1216 12:49:37.675954 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:37 crc kubenswrapper[4757]: I1216 12:49:37.680483 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ggbw2" Dec 16 12:49:41 crc kubenswrapper[4757]: I1216 12:49:41.023800 4757 patch_prober.go:28] interesting pod/console-f9d7485db-zlc9d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 16 12:49:41 crc kubenswrapper[4757]: I1216 12:49:41.024149 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zlc9d" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 16 12:49:41 crc kubenswrapper[4757]: I1216 12:49:41.045658 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4hxkq" Dec 16 12:49:41 crc kubenswrapper[4757]: I1216 12:49:41.530183 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:49:51 crc kubenswrapper[4757]: I1216 12:49:51.028496 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:51 crc kubenswrapper[4757]: I1216 12:49:51.033813 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 12:49:51 crc kubenswrapper[4757]: I1216 12:49:51.180980 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:49:51 crc kubenswrapper[4757]: I1216 12:49:51.181089 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:49:52 crc kubenswrapper[4757]: I1216 12:49:52.607497 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lkb7m" Dec 16 12:49:52 crc kubenswrapper[4757]: E1216 12:49:52.939670 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 12:49:52 crc kubenswrapper[4757]: E1216 12:49:52.939857 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7q6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bbrff_openshift-marketplace(ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 12:49:52 crc kubenswrapper[4757]: E1216 12:49:52.941227 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bbrff" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" Dec 16 12:49:54 crc kubenswrapper[4757]: E1216 12:49:54.541719 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bbrff" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" Dec 16 12:50:00 crc kubenswrapper[4757]: E1216 12:50:00.860796 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 16 12:50:00 crc kubenswrapper[4757]: E1216 12:50:00.861305 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xx48j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-smj7p_openshift-marketplace(7c067ef6-5957-4cfd-be96-788f4236d990): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 12:50:00 crc kubenswrapper[4757]: E1216 12:50:00.863345 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-smj7p" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.117279 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k6rww"] Dec 16 12:50:01 crc kubenswrapper[4757]: W1216 12:50:01.122516 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c1b0cca_3853_4bcf_8389_2fa9c754b5e8.slice/crio-d8a0cfd2ea28d2083243cd4965065614e147c36f7a2f19bfdb73a2f3f089593d WatchSource:0}: Error finding container d8a0cfd2ea28d2083243cd4965065614e147c36f7a2f19bfdb73a2f3f089593d: Status 404 returned error can't find the container with id d8a0cfd2ea28d2083243cd4965065614e147c36f7a2f19bfdb73a2f3f089593d Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.384614 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.736707 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerID="adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b" exitCode=0 Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.736783 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ppdm" event={"ID":"c8ab79c2-762d-4773-ae6e-6e92acdf4508","Type":"ContainerDied","Data":"adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.743615 4757 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6rww" event={"ID":"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8","Type":"ContainerStarted","Data":"468a2104e878254861534295f918e514d79f74a8ebd31a05eb958d82b115ad31"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.743678 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6rww" event={"ID":"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8","Type":"ContainerStarted","Data":"d8a0cfd2ea28d2083243cd4965065614e147c36f7a2f19bfdb73a2f3f089593d"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.746748 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerStarted","Data":"97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.749133 4757 generic.go:334] "Generic (PLEG): container finished" podID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerID="3513080f792e82d2d4326182c95754a9b4946c172f9c0cf5cc1e58d2db5985b6" exitCode=0 Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.749210 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krvn2" event={"ID":"17e402cb-44b0-4232-8671-b7db09c8e9b1","Type":"ContainerDied","Data":"3513080f792e82d2d4326182c95754a9b4946c172f9c0cf5cc1e58d2db5985b6"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.753428 4757 generic.go:334] "Generic (PLEG): container finished" podID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerID="f8b4f86cc95a865e67e1e135bd86d1a0ee82445fb640122f00c7a758d8e79522" exitCode=0 Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.753511 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqtlh" event={"ID":"7320d121-c9e6-4af2-ad14-4db89ea38a9e","Type":"ContainerDied","Data":"f8b4f86cc95a865e67e1e135bd86d1a0ee82445fb640122f00c7a758d8e79522"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.766175 4757 generic.go:334] "Generic (PLEG): container finished" podID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerID="85bb4b97b11ad53163f8952d48b4cd0711d53a4b4c89bbda058f326a7e32a969" exitCode=0 Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.766234 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srvcx" event={"ID":"8bca5421-7190-4386-9a4d-fc01e88be52e","Type":"ContainerDied","Data":"85bb4b97b11ad53163f8952d48b4cd0711d53a4b4c89bbda058f326a7e32a969"} Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.774694 4757 generic.go:334] "Generic (PLEG): container finished" podID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerID="131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172" exitCode=0 Dec 16 12:50:01 crc kubenswrapper[4757]: I1216 12:50:01.775503 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4dkc" event={"ID":"2a2e530d-95e8-4bdc-8710-b08bb4d99f17","Type":"ContainerDied","Data":"131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172"} Dec 16 12:50:01 crc kubenswrapper[4757]: E1216 12:50:01.783309 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-smj7p" 
Dec 16 12:50:02 crc kubenswrapper[4757]: I1216 12:50:02.783078 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k6rww" event={"ID":"0c1b0cca-3853-4bcf-8389-2fa9c754b5e8","Type":"ContainerStarted","Data":"371e8db5925cd0d87a95e780732f1865fa8b1405b5f422f026b478067f3d21a3"}
Dec 16 12:50:02 crc kubenswrapper[4757]: I1216 12:50:02.798415 4757 generic.go:334] "Generic (PLEG): container finished" podID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerID="97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219" exitCode=0
Dec 16 12:50:02 crc kubenswrapper[4757]: I1216 12:50:02.798471 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerDied","Data":"97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219"}
Dec 16 12:50:02 crc kubenswrapper[4757]: I1216 12:50:02.818817 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k6rww" podStartSLOduration=168.818796407 podStartE2EDuration="2m48.818796407s" podCreationTimestamp="2025-12-16 12:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:50:02.816330925 +0000 UTC m=+188.244074731" watchObservedRunningTime="2025-12-16 12:50:02.818796407 +0000 UTC m=+188.246540203"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.144719 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 16 12:50:03 crc kubenswrapper[4757]: E1216 12:50:03.144917 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d862c66e-b538-48e8-bbcb-a0cb2715a7de" containerName="collect-profiles"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.144928 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d862c66e-b538-48e8-bbcb-a0cb2715a7de" containerName="collect-profiles"
Dec 16 12:50:03 crc kubenswrapper[4757]: E1216 12:50:03.144944 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae0e927-fa11-4169-aa7c-2eeac9821264" containerName="pruner"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.144949 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae0e927-fa11-4169-aa7c-2eeac9821264" containerName="pruner"
Dec 16 12:50:03 crc kubenswrapper[4757]: E1216 12:50:03.144956 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e61710-9897-41fc-ab0a-2b054944d6eb" containerName="pruner"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.144962 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e61710-9897-41fc-ab0a-2b054944d6eb" containerName="pruner"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.145079 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e61710-9897-41fc-ab0a-2b054944d6eb" containerName="pruner"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.145093 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae0e927-fa11-4169-aa7c-2eeac9821264" containerName="pruner"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.145102 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d862c66e-b538-48e8-bbcb-a0cb2715a7de" containerName="collect-profiles"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.145413 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.148073 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.148360 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.158766 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.238387 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.238463 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.339612 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.339707 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.339767 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.367682 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 16 12:50:03 crc kubenswrapper[4757]: I1216 12:50:03.474180 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
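
Note: the reconciler entries above walk the kubelet's volume state machine for revision-pruner-9-crc, in order: VerifyControllerAttachedVolume, then MountVolume, then MountVolume.SetUp succeeded, for one kubernetes.io/host-path volume (kubelet-dir) and one kubernetes.io/projected volume (kube-api-access). A hedged sketch of how two such volumes are declared; the hostPath path and the projected sources are assumptions inferred from the volume names and the reflector entries, not something this log states:

    package main

    import corev1 "k8s.io/api/core/v1"

    // Two volumes matching the plugin types in the reconciler entries above:
    // kubelet-dir (kubernetes.io/host-path) and kube-api-access (kubernetes.io/projected).
    func revisionPrunerVolumes() []corev1.Volume {
        dir := corev1.HostPathDirectory
        return []corev1.Volume{
            {
                Name: "kubelet-dir",
                VolumeSource: corev1.VolumeSource{
                    // The path is an assumption; the log records only plugin and volume name.
                    HostPath: &corev1.HostPathVolumeSource{
                        Path: "/etc/kubernetes/static-pod-resources",
                        Type: &dir,
                    },
                },
            },
            {
                Name: "kube-api-access",
                VolumeSource: corev1.VolumeSource{
                    Projected: &corev1.ProjectedVolumeSource{
                        Sources: []corev1.VolumeProjection{
                            {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
                            {ConfigMap: &corev1.ConfigMapProjection{
                                // kube-root-ca.crt matches the ConfigMap the reflector cached above.
                                LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                                Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                            }},
                        },
                    },
                },
            },
        }
    }

    func main() { _ = revisionPrunerVolumes() }
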
Dec 16 12:50:04 crc kubenswrapper[4757]: I1216 12:50:04.541932 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 16 12:50:04 crc kubenswrapper[4757]: I1216 12:50:04.809988 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9","Type":"ContainerStarted","Data":"83a78c05363f45eb8faf8436561fb26d15444f8189dc47ee93d8118b677146fb"}
Dec 16 12:50:04 crc kubenswrapper[4757]: I1216 12:50:04.813385 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srvcx" event={"ID":"8bca5421-7190-4386-9a4d-fc01e88be52e","Type":"ContainerStarted","Data":"2708d42d36db97fb526b75fbb8863e35dcebbd18fa0878ddc9b221b35fb53229"}
Dec 16 12:50:04 crc kubenswrapper[4757]: I1216 12:50:04.834332 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-srvcx" podStartSLOduration=4.368231132 podStartE2EDuration="46.834315187s" podCreationTimestamp="2025-12-16 12:49:18 +0000 UTC" firstStartedPulling="2025-12-16 12:49:21.689267893 +0000 UTC m=+147.117011689" lastFinishedPulling="2025-12-16 12:50:04.155351948 +0000 UTC m=+189.583095744" observedRunningTime="2025-12-16 12:50:04.831489346 +0000 UTC m=+190.259233142" watchObservedRunningTime="2025-12-16 12:50:04.834315187 +0000 UTC m=+190.262058983"
Dec 16 12:50:06 crc kubenswrapper[4757]: I1216 12:50:06.826444 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerStarted","Data":"4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3"}
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.750750 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.751591 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.760020 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.790863 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-var-lock\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.790935 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.791043 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kube-api-access\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.831368 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9","Type":"ContainerStarted","Data":"5e9b8b5825de69bb8b48e5a53d3635bdf9940feae4165bcfc78c1f04da968293"}
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.892527 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kube-api-access\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.892578 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-var-lock\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.892610 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.892671 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.892969 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-var-lock\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
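
Note: in the "Observed pod startup duration" entries in this log, podStartSLOduration appears to be podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling); for certified-operators-srvcx above, 46.834315187s - 42.466084055s = 4.368231132s, which matches exactly. A small sketch reproducing that arithmetic from the logged timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds parse implicitly
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        // Timestamps copied from the certified-operators-srvcx tracker entry above.
        created := parse("2025-12-16 12:49:18 +0000 UTC")
        firstPull := parse("2025-12-16 12:49:21.689267893 +0000 UTC")
        lastPull := parse("2025-12-16 12:50:04.155351948 +0000 UTC")
        running := parse("2025-12-16 12:50:04.834315187 +0000 UTC") // watchObservedRunningTime

        e2e := running.Sub(created)          // 46.834315187s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 4.368231132s  = podStartSLOduration
        fmt.Println(e2e, slo)
    }
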
Dec 16 12:50:07 crc kubenswrapper[4757]: I1216 12:50:07.920767 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kube-api-access\") pod \"installer-9-crc\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:08 crc kubenswrapper[4757]: I1216 12:50:08.075939 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 16 12:50:08 crc kubenswrapper[4757]: I1216 12:50:08.816213 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-srvcx"
Dec 16 12:50:08 crc kubenswrapper[4757]: I1216 12:50:08.816268 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-srvcx"
Dec 16 12:50:08 crc kubenswrapper[4757]: I1216 12:50:08.838787 4757 generic.go:334] "Generic (PLEG): container finished" podID="041b8f25-9797-41d0-9ae1-cb2a86f9c5a9" containerID="5e9b8b5825de69bb8b48e5a53d3635bdf9940feae4165bcfc78c1f04da968293" exitCode=0
Dec 16 12:50:08 crc kubenswrapper[4757]: I1216 12:50:08.838903 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9","Type":"ContainerDied","Data":"5e9b8b5825de69bb8b48e5a53d3635bdf9940feae4165bcfc78c1f04da968293"}
Dec 16 12:50:08 crc kubenswrapper[4757]: I1216 12:50:08.859819 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2hf5q" podStartSLOduration=7.436519636 podStartE2EDuration="47.859800688s" podCreationTimestamp="2025-12-16 12:49:21 +0000 UTC" firstStartedPulling="2025-12-16 12:49:25.270462914 +0000 UTC m=+150.698206710" lastFinishedPulling="2025-12-16 12:50:05.693743966 +0000 UTC m=+191.121487762" observedRunningTime="2025-12-16 12:50:08.859104821 +0000 UTC m=+194.286848607" watchObservedRunningTime="2025-12-16 12:50:08.859800688 +0000 UTC m=+194.287544484"
Dec 16 12:50:09 crc kubenswrapper[4757]: I1216 12:50:09.082588 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-srvcx"
Dec 16 12:50:09 crc kubenswrapper[4757]: I1216 12:50:09.148974 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-srvcx"
Dec 16 12:50:09 crc kubenswrapper[4757]: I1216 12:50:09.421256 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 16 12:50:09 crc kubenswrapper[4757]: I1216 12:50:09.774605 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wzkwh"]
Dec 16 12:50:09 crc kubenswrapper[4757]: I1216 12:50:09.844282 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18","Type":"ContainerStarted","Data":"eb6c0222c0b37dd81e65b033bf3cfe2a06ce581f75050fa634f12915808433b7"}
Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.205780 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.229374 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kube-api-access\") pod \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.229462 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kubelet-dir\") pod \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\" (UID: \"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9\") " Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.229604 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "041b8f25-9797-41d0-9ae1-cb2a86f9c5a9" (UID: "041b8f25-9797-41d0-9ae1-cb2a86f9c5a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.229761 4757 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.234495 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "041b8f25-9797-41d0-9ae1-cb2a86f9c5a9" (UID: "041b8f25-9797-41d0-9ae1-cb2a86f9c5a9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.330876 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/041b8f25-9797-41d0-9ae1-cb2a86f9c5a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.855048 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4dkc" event={"ID":"2a2e530d-95e8-4bdc-8710-b08bb4d99f17","Type":"ContainerStarted","Data":"15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522"} Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.857877 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18","Type":"ContainerStarted","Data":"30350ecdc40e5317b6e3b511899e7a3ea245816a20f062ef69b715f4383cff03"} Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.860252 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"041b8f25-9797-41d0-9ae1-cb2a86f9c5a9","Type":"ContainerDied","Data":"83a78c05363f45eb8faf8436561fb26d15444f8189dc47ee93d8118b677146fb"} Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.860279 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a78c05363f45eb8faf8436561fb26d15444f8189dc47ee93d8118b677146fb" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.860332 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.879175 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4dkc" podStartSLOduration=5.797598709 podStartE2EDuration="50.879161754s" podCreationTimestamp="2025-12-16 12:49:20 +0000 UTC" firstStartedPulling="2025-12-16 12:49:23.790395508 +0000 UTC m=+149.218139304" lastFinishedPulling="2025-12-16 12:50:08.871958553 +0000 UTC m=+194.299702349" observedRunningTime="2025-12-16 12:50:10.876916408 +0000 UTC m=+196.304660194" watchObservedRunningTime="2025-12-16 12:50:10.879161754 +0000 UTC m=+196.306905540" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.939979 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.939957547 podStartE2EDuration="3.939957547s" podCreationTimestamp="2025-12-16 12:50:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:50:10.936624134 +0000 UTC m=+196.364367930" watchObservedRunningTime="2025-12-16 12:50:10.939957547 +0000 UTC m=+196.367701343" Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.971615 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srvcx"] Dec 16 12:50:10 crc kubenswrapper[4757]: I1216 12:50:10.972052 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-srvcx" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="registry-server" containerID="cri-o://2708d42d36db97fb526b75fbb8863e35dcebbd18fa0878ddc9b221b35fb53229" gracePeriod=2 Dec 16 12:50:11 crc kubenswrapper[4757]: I1216 12:50:11.636174 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:50:11 crc kubenswrapper[4757]: I1216 12:50:11.636235 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2hf5q" Dec 16 12:50:11 crc kubenswrapper[4757]: I1216 12:50:11.868500 4757 generic.go:334] "Generic (PLEG): container finished" podID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerID="2708d42d36db97fb526b75fbb8863e35dcebbd18fa0878ddc9b221b35fb53229" exitCode=0 Dec 16 12:50:11 crc kubenswrapper[4757]: I1216 12:50:11.868582 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srvcx" event={"ID":"8bca5421-7190-4386-9a4d-fc01e88be52e","Type":"ContainerDied","Data":"2708d42d36db97fb526b75fbb8863e35dcebbd18fa0878ddc9b221b35fb53229"} Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.612531 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srvcx"
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.665045 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-utilities\") pod \"8bca5421-7190-4386-9a4d-fc01e88be52e\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") "
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.665126 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-catalog-content\") pod \"8bca5421-7190-4386-9a4d-fc01e88be52e\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") "
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.665203 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgw9n\" (UniqueName: \"kubernetes.io/projected/8bca5421-7190-4386-9a4d-fc01e88be52e-kube-api-access-zgw9n\") pod \"8bca5421-7190-4386-9a4d-fc01e88be52e\" (UID: \"8bca5421-7190-4386-9a4d-fc01e88be52e\") "
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.666208 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-utilities" (OuterVolumeSpecName: "utilities") pod "8bca5421-7190-4386-9a4d-fc01e88be52e" (UID: "8bca5421-7190-4386-9a4d-fc01e88be52e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.671879 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bca5421-7190-4386-9a4d-fc01e88be52e-kube-api-access-zgw9n" (OuterVolumeSpecName: "kube-api-access-zgw9n") pod "8bca5421-7190-4386-9a4d-fc01e88be52e" (UID: "8bca5421-7190-4386-9a4d-fc01e88be52e"). InnerVolumeSpecName "kube-api-access-zgw9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.683805 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hf5q" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="registry-server" probeResult="failure" output=<
Dec 16 12:50:12 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s
Dec 16 12:50:12 crc kubenswrapper[4757]: >
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.727388 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bca5421-7190-4386-9a4d-fc01e88be52e" (UID: "8bca5421-7190-4386-9a4d-fc01e88be52e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
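
Note: the "Probe failed" block above is the registry-server startup probe timing out against the catalog gRPC endpoint on :50051. A minimal Go sketch of an equivalent gRPC health check; the one-second budget and the port mirror the log, while the client construction and the use of the standard grpc_health_v1 service are assumptions about how the probe is implemented:

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
        // One-second budget, matching "within 1s" in the probe output above.
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()

        conn, err := grpc.Dial("localhost:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
        if err != nil {
            fmt.Println("probe failure:", err) // the condition the kubelet logged above
            return
        }
        fmt.Println("probe status:", resp.GetStatus()) // SERVING once the catalog is up
    }
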
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.767895 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgw9n\" (UniqueName: \"kubernetes.io/projected/8bca5421-7190-4386-9a4d-fc01e88be52e-kube-api-access-zgw9n\") on node \"crc\" DevicePath \"\""
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.767980 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.767994 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bca5421-7190-4386-9a4d-fc01e88be52e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.879708 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srvcx" event={"ID":"8bca5421-7190-4386-9a4d-fc01e88be52e","Type":"ContainerDied","Data":"bdae43a78c0411976da8284ea438f5c12ce33702205089d919a1080ee4f2c0d8"}
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.879759 4757 scope.go:117] "RemoveContainer" containerID="2708d42d36db97fb526b75fbb8863e35dcebbd18fa0878ddc9b221b35fb53229"
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.879911 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-srvcx"
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.916087 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srvcx"]
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.928084 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-srvcx"]
Dec 16 12:50:12 crc kubenswrapper[4757]: I1216 12:50:12.957533 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" path="/var/lib/kubelet/pods/8bca5421-7190-4386-9a4d-fc01e88be52e/volumes"
Dec 16 12:50:13 crc kubenswrapper[4757]: I1216 12:50:13.635211 4757 scope.go:117] "RemoveContainer" containerID="85bb4b97b11ad53163f8952d48b4cd0711d53a4b4c89bbda058f326a7e32a969"
Dec 16 12:50:16 crc kubenswrapper[4757]: I1216 12:50:16.948518 4757 scope.go:117] "RemoveContainer" containerID="f5fb11d547abc28016931b5b5569be3b6edf46368958ef9280e271e41c63e2c2"
Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.912584 4757 generic.go:334] "Generic (PLEG): container finished" podID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerID="60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55" exitCode=0
Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.912643 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerDied","Data":"60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55"}
Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.918585 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ppdm" event={"ID":"c8ab79c2-762d-4773-ae6e-6e92acdf4508","Type":"ContainerStarted","Data":"84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097"}
Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.921598 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-krvn2" event={"ID":"17e402cb-44b0-4232-8671-b7db09c8e9b1","Type":"ContainerStarted","Data":"c87f661a3dd474d83b02f52699d3835e75e1fe68468d2ba07f74f28ea2bc0af7"} Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.924852 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqtlh" event={"ID":"7320d121-c9e6-4af2-ad14-4db89ea38a9e","Type":"ContainerStarted","Data":"b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41"} Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.927592 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerStarted","Data":"3981ce847ea94a6d6649c922a8032fa4852cf2b6710288337a3f4cc057204370"} Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.951990 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krvn2" podStartSLOduration=10.318792667 podStartE2EDuration="58.951970872s" podCreationTimestamp="2025-12-16 12:49:20 +0000 UTC" firstStartedPulling="2025-12-16 12:49:23.896596568 +0000 UTC m=+149.324340364" lastFinishedPulling="2025-12-16 12:50:12.529774773 +0000 UTC m=+197.957518569" observedRunningTime="2025-12-16 12:50:18.951212912 +0000 UTC m=+204.378956718" watchObservedRunningTime="2025-12-16 12:50:18.951970872 +0000 UTC m=+204.379714668" Dec 16 12:50:18 crc kubenswrapper[4757]: I1216 12:50:18.977419 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7ppdm" podStartSLOduration=11.296824334 podStartE2EDuration="1m1.977397298s" podCreationTimestamp="2025-12-16 12:49:17 +0000 UTC" firstStartedPulling="2025-12-16 12:49:21.667417446 +0000 UTC m=+147.095161242" lastFinishedPulling="2025-12-16 12:50:12.34799041 +0000 UTC m=+197.775734206" observedRunningTime="2025-12-16 12:50:18.975294551 +0000 UTC m=+204.403038347" watchObservedRunningTime="2025-12-16 12:50:18.977397298 +0000 UTC m=+204.405141094" Dec 16 12:50:19 crc kubenswrapper[4757]: I1216 12:50:19.018688 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqtlh" podStartSLOduration=3.89879481 podStartE2EDuration="1m1.018667235s" podCreationTimestamp="2025-12-16 12:49:18 +0000 UTC" firstStartedPulling="2025-12-16 12:49:20.556417304 +0000 UTC m=+145.984161090" lastFinishedPulling="2025-12-16 12:50:17.676289719 +0000 UTC m=+203.104033515" observedRunningTime="2025-12-16 12:50:19.016833145 +0000 UTC m=+204.444576941" watchObservedRunningTime="2025-12-16 12:50:19.018667235 +0000 UTC m=+204.446411041" Dec 16 12:50:19 crc kubenswrapper[4757]: I1216 12:50:19.935843 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerStarted","Data":"2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14"} Dec 16 12:50:19 crc kubenswrapper[4757]: I1216 12:50:19.937424 4757 generic.go:334] "Generic (PLEG): container finished" podID="7c067ef6-5957-4cfd-be96-788f4236d990" containerID="3981ce847ea94a6d6649c922a8032fa4852cf2b6710288337a3f4cc057204370" exitCode=0 Dec 16 12:50:19 crc kubenswrapper[4757]: I1216 12:50:19.937528 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" 
event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerDied","Data":"3981ce847ea94a6d6649c922a8032fa4852cf2b6710288337a3f4cc057204370"} Dec 16 12:50:19 crc kubenswrapper[4757]: I1216 12:50:19.954955 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bbrff" podStartSLOduration=4.234548081 podStartE2EDuration="1m1.954936101s" podCreationTimestamp="2025-12-16 12:49:18 +0000 UTC" firstStartedPulling="2025-12-16 12:49:21.663212051 +0000 UTC m=+147.090955847" lastFinishedPulling="2025-12-16 12:50:19.383600071 +0000 UTC m=+204.811343867" observedRunningTime="2025-12-16 12:50:19.954312893 +0000 UTC m=+205.382056739" watchObservedRunningTime="2025-12-16 12:50:19.954936101 +0000 UTC m=+205.382679907" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.487167 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.488938 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.528383 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.808955 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.809029 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.855509 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.943939 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerStarted","Data":"ac640ed7caadf3e8677469672278695a052c560f734ea3d2cf094717a3eef02f"} Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.963544 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smj7p" podStartSLOduration=3.17025455 podStartE2EDuration="59.963525486s" podCreationTimestamp="2025-12-16 12:49:21 +0000 UTC" firstStartedPulling="2025-12-16 12:49:23.812355758 +0000 UTC m=+149.240099554" lastFinishedPulling="2025-12-16 12:50:20.605626694 +0000 UTC m=+206.033370490" observedRunningTime="2025-12-16 12:50:20.962846296 +0000 UTC m=+206.390590102" watchObservedRunningTime="2025-12-16 12:50:20.963525486 +0000 UTC m=+206.391269292" Dec 16 12:50:20 crc kubenswrapper[4757]: I1216 12:50:20.994728 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.181765 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.181834 4757 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.181905 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.182500 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.182611 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3" gracePeriod=600
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.379154 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smj7p"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.379419 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smj7p"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.677849 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2hf5q"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.720244 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2hf5q"
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.952101 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3" exitCode=0
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.952219 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3"}
Dec 16 12:50:21 crc kubenswrapper[4757]: I1216 12:50:21.952287 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"caa3a93ad3bd3927512be6975f6d9bbe16d0438123c8248da65c133894f8be8b"}
Dec 16 12:50:22 crc kubenswrapper[4757]: I1216 12:50:22.430804 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smj7p" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="registry-server" probeResult="failure" output=<
Dec 16 12:50:22 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s
Dec 16 12:50:22 crc kubenswrapper[4757]: >
Dec 16 12:50:25 crc kubenswrapper[4757]: I1216 12:50:25.082417 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4dkc"]
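
Note: the machine-config-daemon sequence above is a complete liveness-restart cycle: the HTTP probe to http://127.0.0.1:8798/health is refused, the kubelet marks the container "will be restarted", kills it with gracePeriod=600 (normally the pod's terminationGracePeriodSeconds), and PLEG then reports ContainerDied followed by ContainerStarted. A hedged sketch of how such a liveness probe is declared; path, host, and port come from the log, while the timing values are illustrative assumptions:

    package main

    import (
        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    // livenessProbe mirrors the check the kubelet ran above:
    // GET http://127.0.0.1:8798/health.
    func livenessProbe() *corev1.Probe {
        return &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "127.0.0.1", // the probe URL in the log targets localhost explicitly
                    Path: "/health",
                    Port: intstr.FromInt(8798),
                },
            },
            // Timing values are illustrative assumptions, not from the log.
            PeriodSeconds:    10,
            FailureThreshold: 3,
        }
    }

    func main() { _ = livenessProbe() }
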
Dec 16 12:50:25 crc kubenswrapper[4757]: I1216 12:50:25.082976 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4dkc" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="registry-server" containerID="cri-o://15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522" gracePeriod=2
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.085450 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hf5q"]
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.085967 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2hf5q" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="registry-server" containerID="cri-o://4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3" gracePeriod=2
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.417457 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hf5q"
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.431174 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-catalog-content\") pod \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") "
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.431257 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-utilities\") pod \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") "
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.432468 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-utilities" (OuterVolumeSpecName: "utilities") pod "1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" (UID: "1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.532954 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lxm\" (UniqueName: \"kubernetes.io/projected/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-kube-api-access-48lxm\") pod \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\" (UID: \"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1\") "
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.542837 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" (UID: "1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.543597 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.543620 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.548433 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-kube-api-access-48lxm" (OuterVolumeSpecName: "kube-api-access-48lxm") pod "1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" (UID: "1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1"). InnerVolumeSpecName "kube-api-access-48lxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.630372 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.644207 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zzvr\" (UniqueName: \"kubernetes.io/projected/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-kube-api-access-6zzvr\") pod \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.644310 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-catalog-content\") pod \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.644429 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-utilities\") pod \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\" (UID: \"2a2e530d-95e8-4bdc-8710-b08bb4d99f17\") " Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.644722 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lxm\" (UniqueName: \"kubernetes.io/projected/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1-kube-api-access-48lxm\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.645615 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-utilities" (OuterVolumeSpecName: "utilities") pod "2a2e530d-95e8-4bdc-8710-b08bb4d99f17" (UID: "2a2e530d-95e8-4bdc-8710-b08bb4d99f17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.647680 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-kube-api-access-6zzvr" (OuterVolumeSpecName: "kube-api-access-6zzvr") pod "2a2e530d-95e8-4bdc-8710-b08bb4d99f17" (UID: "2a2e530d-95e8-4bdc-8710-b08bb4d99f17"). InnerVolumeSpecName "kube-api-access-6zzvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.670409 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a2e530d-95e8-4bdc-8710-b08bb4d99f17" (UID: "2a2e530d-95e8-4bdc-8710-b08bb4d99f17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.745941 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zzvr\" (UniqueName: \"kubernetes.io/projected/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-kube-api-access-6zzvr\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.745978 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.745989 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a2e530d-95e8-4bdc-8710-b08bb4d99f17-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.982199 4757 generic.go:334] "Generic (PLEG): container finished" podID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerID="15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522" exitCode=0 Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.982259 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4dkc" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.982278 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4dkc" event={"ID":"2a2e530d-95e8-4bdc-8710-b08bb4d99f17","Type":"ContainerDied","Data":"15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522"} Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.982324 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4dkc" event={"ID":"2a2e530d-95e8-4bdc-8710-b08bb4d99f17","Type":"ContainerDied","Data":"92a85932e771ebf9368c9939a24cbb785b401ae69edec2d6a3959098f1ae7492"} Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.982346 4757 scope.go:117] "RemoveContainer" containerID="15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522" Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.986138 4757 generic.go:334] "Generic (PLEG): container finished" podID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerID="4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3" exitCode=0 Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.986227 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2hf5q"
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.986186 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerDied","Data":"4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3"}
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.986883 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hf5q" event={"ID":"1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1","Type":"ContainerDied","Data":"de56d260429dddace7b2d2e604920802e32a72818b352bc9a30454fbca939ba7"}
Dec 16 12:50:26 crc kubenswrapper[4757]: I1216 12:50:26.996900 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4dkc"]
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.004435 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4dkc"]
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.015273 4757 scope.go:117] "RemoveContainer" containerID="131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.019822 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hf5q"]
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.023601 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2hf5q"]
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.029617 4757 scope.go:117] "RemoveContainer" containerID="5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.044232 4757 scope.go:117] "RemoveContainer" containerID="15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522"
Dec 16 12:50:27 crc kubenswrapper[4757]: E1216 12:50:27.044662 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522\": container with ID starting with 15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522 not found: ID does not exist" containerID="15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.044716 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522"} err="failed to get container status \"15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522\": rpc error: code = NotFound desc = could not find container \"15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522\": container with ID starting with 15831ddb2ce6723715048119347d8ac55268f5360472f12929b25aea9bff7522 not found: ID does not exist"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.044752 4757 scope.go:117] "RemoveContainer" containerID="131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172"
Dec 16 12:50:27 crc kubenswrapper[4757]: E1216 12:50:27.045162 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172\": container with ID starting with 131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172 not found: ID does not exist\" containerID="131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172"
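
Note: the RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets above and below are benign: the kubelet asks the runtime about a container that is already gone and gets gRPC NotFound back, so cleanup can treat the error as success. A sketch of that idempotent-delete pattern; removeIdempotent and the stub remover are hypothetical helpers, not kubelet or CRI functions:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIdempotent treats gRPC NotFound as "already removed", the same way
    // the kubelet's cleanup tolerates the NotFound errors logged here.
    func removeIdempotent(remove func(id string) error, id string) error {
        if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
            return fmt.Errorf("removing container %s: %w", id, err)
        }
        return nil // either removed now or already gone
    }

    func main() {
        // Stub remover that always reports NotFound, mimicking the log above.
        notFound := func(id string) error {
            return status.Error(codes.NotFound, "could not find container "+id)
        }
        if err := removeIdempotent(notFound, "15831ddb2ce6"); err != nil {
            panic(err)
        }
        fmt.Println("NotFound treated as already-removed")
    }
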
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.045201 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172"} err="failed to get container status \"131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172\": rpc error: code = NotFound desc = could not find container \"131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172\": container with ID starting with 131416170b56c1778f99b2e5cc6020cf90ea57113640c3270b22cfe17672e172 not found: ID does not exist"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.045234 4757 scope.go:117] "RemoveContainer" containerID="5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8"
Dec 16 12:50:27 crc kubenswrapper[4757]: E1216 12:50:27.045450 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8\": container with ID starting with 5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8 not found: ID does not exist" containerID="5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.045485 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8"} err="failed to get container status \"5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8\": rpc error: code = NotFound desc = could not find container \"5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8\": container with ID starting with 5a6c2d37b5633c81d1da4998e68e80ba878aa599fa842923e07daaf4e7876dd8 not found: ID does not exist"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.045503 4757 scope.go:117] "RemoveContainer" containerID="4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.058341 4757 scope.go:117] "RemoveContainer" containerID="97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.074120 4757 scope.go:117] "RemoveContainer" containerID="b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.087881 4757 scope.go:117] "RemoveContainer" containerID="4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3"
Dec 16 12:50:27 crc kubenswrapper[4757]: E1216 12:50:27.088363 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3\": container with ID starting with 4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3 not found: ID does not exist" containerID="4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3"
Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.088390 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3"} err="failed to get container status \"4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3\": rpc error: code = NotFound desc = could not find container 
\"4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3\": container with ID starting with 4a32b065efcba5517738170671deaea43f8285660498613e20fa0589928f82f3 not found: ID does not exist" Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.088412 4757 scope.go:117] "RemoveContainer" containerID="97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219" Dec 16 12:50:27 crc kubenswrapper[4757]: E1216 12:50:27.088745 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219\": container with ID starting with 97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219 not found: ID does not exist" containerID="97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219" Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.088787 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219"} err="failed to get container status \"97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219\": rpc error: code = NotFound desc = could not find container \"97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219\": container with ID starting with 97aaa2c6ef19922ca90d590d602f83799273cf6856492273376b927dbd970219 not found: ID does not exist" Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.088800 4757 scope.go:117] "RemoveContainer" containerID="b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c" Dec 16 12:50:27 crc kubenswrapper[4757]: E1216 12:50:27.089131 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c\": container with ID starting with b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c not found: ID does not exist" containerID="b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c" Dec 16 12:50:27 crc kubenswrapper[4757]: I1216 12:50:27.089172 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c"} err="failed to get container status \"b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c\": rpc error: code = NotFound desc = could not find container \"b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c\": container with ID starting with b783506b0a048717376e5843155496bc27094e4adaf4a38fa6c21117e78ff56c not found: ID does not exist" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.205116 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.205461 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.245947 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.370304 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.371237 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.407998 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.605489 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.605549 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.639118 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.957315 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" path="/var/lib/kubelet/pods/1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1/volumes" Dec 16 12:50:28 crc kubenswrapper[4757]: I1216 12:50:28.958029 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" path="/var/lib/kubelet/pods/2a2e530d-95e8-4bdc-8710-b08bb4d99f17/volumes" Dec 16 12:50:29 crc kubenswrapper[4757]: I1216 12:50:29.038672 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:50:29 crc kubenswrapper[4757]: I1216 12:50:29.039548 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:50:29 crc kubenswrapper[4757]: I1216 12:50:29.047060 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:50:30 crc kubenswrapper[4757]: I1216 12:50:30.535662 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:50:31 crc kubenswrapper[4757]: I1216 12:50:31.427533 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:50:31 crc kubenswrapper[4757]: I1216 12:50:31.474447 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:50:31 crc kubenswrapper[4757]: I1216 12:50:31.489897 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbrff"] Dec 16 12:50:31 crc kubenswrapper[4757]: I1216 12:50:31.490114 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bbrff" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="registry-server" containerID="cri-o://2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14" gracePeriod=2 Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.812554 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.838390 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-catalog-content\") pod \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.838454 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-utilities\") pod \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.838506 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7q6n\" (UniqueName: \"kubernetes.io/projected/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-kube-api-access-q7q6n\") pod \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\" (UID: \"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7\") " Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.840301 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-utilities" (OuterVolumeSpecName: "utilities") pod "ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" (UID: "ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.860381 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-kube-api-access-q7q6n" (OuterVolumeSpecName: "kube-api-access-q7q6n") pod "ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" (UID: "ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7"). InnerVolumeSpecName "kube-api-access-q7q6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.911031 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" (UID: "ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.939818 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7q6n\" (UniqueName: \"kubernetes.io/projected/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-kube-api-access-q7q6n\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.939895 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:32 crc kubenswrapper[4757]: I1216 12:50:32.939906 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.022391 4757 generic.go:334] "Generic (PLEG): container finished" podID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerID="2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14" exitCode=0 Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.022430 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerDied","Data":"2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14"} Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.022454 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbrff" event={"ID":"ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7","Type":"ContainerDied","Data":"a3e9c04c3b1022bd86a34470adf71fd5e2564643a59e613fcf0b475c880e27dd"} Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.022471 4757 scope.go:117] "RemoveContainer" containerID="2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.022572 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbrff" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.042795 4757 scope.go:117] "RemoveContainer" containerID="60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.051125 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbrff"] Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.054252 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bbrff"] Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.058613 4757 scope.go:117] "RemoveContainer" containerID="57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.078420 4757 scope.go:117] "RemoveContainer" containerID="2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14" Dec 16 12:50:33 crc kubenswrapper[4757]: E1216 12:50:33.079070 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14\": container with ID starting with 2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14 not found: ID does not exist" containerID="2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.079123 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14"} err="failed to get container status \"2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14\": rpc error: code = NotFound desc = could not find container \"2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14\": container with ID starting with 2a9fbf8af755712b0c072c3ce2bfdce5f097762cfe5266e227eed6be6ce2ae14 not found: ID does not exist" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.079154 4757 scope.go:117] "RemoveContainer" containerID="60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55" Dec 16 12:50:33 crc kubenswrapper[4757]: E1216 12:50:33.079600 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55\": container with ID starting with 60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55 not found: ID does not exist" containerID="60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.079641 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55"} err="failed to get container status \"60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55\": rpc error: code = NotFound desc = could not find container \"60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55\": container with ID starting with 60d04004878d4312bd414e406e5f11f2a321a63b2f8fdd3410daf91962873d55 not found: ID does not exist" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.079670 4757 scope.go:117] "RemoveContainer" containerID="57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587" Dec 16 12:50:33 crc kubenswrapper[4757]: E1216 12:50:33.080067 4757 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587\": container with ID starting with 57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587 not found: ID does not exist" containerID="57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587" Dec 16 12:50:33 crc kubenswrapper[4757]: I1216 12:50:33.080153 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587"} err="failed to get container status \"57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587\": rpc error: code = NotFound desc = could not find container \"57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587\": container with ID starting with 57766291ef0e249ea295ad5a1b26395a4c9538dc9af82fc22f1e35e1c8c3a587 not found: ID does not exist" Dec 16 12:50:34 crc kubenswrapper[4757]: I1216 12:50:34.807199 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" podUID="c67f2cc7-204e-4c8f-9c93-b02372c5c296" containerName="oauth-openshift" containerID="cri-o://ff0ea011e2afd0fa1092c8bafd867e2d454b7236061787f7b15f8e42f3081df5" gracePeriod=15 Dec 16 12:50:34 crc kubenswrapper[4757]: I1216 12:50:34.956301 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" path="/var/lib/kubelet/pods/ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7/volumes" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.035862 4757 generic.go:334] "Generic (PLEG): container finished" podID="c67f2cc7-204e-4c8f-9c93-b02372c5c296" containerID="ff0ea011e2afd0fa1092c8bafd867e2d454b7236061787f7b15f8e42f3081df5" exitCode=0 Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.035903 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" event={"ID":"c67f2cc7-204e-4c8f-9c93-b02372c5c296","Type":"ContainerDied","Data":"ff0ea011e2afd0fa1092c8bafd867e2d454b7236061787f7b15f8e42f3081df5"} Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.170648 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267145 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-policies\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267189 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqjtz\" (UniqueName: \"kubernetes.io/projected/c67f2cc7-204e-4c8f-9c93-b02372c5c296-kube-api-access-tqjtz\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267213 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-cliconfig\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267249 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-service-ca\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267289 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-ocp-branding-template\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267321 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-router-certs\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267343 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-serving-cert\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267363 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-dir\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267384 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-provider-selection\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 
12:50:35.267404 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-login\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267429 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-error\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267451 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-trusted-ca-bundle\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267478 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-idp-0-file-data\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267507 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-session\") pod \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\" (UID: \"c67f2cc7-204e-4c8f-9c93-b02372c5c296\") " Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267879 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267947 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.267955 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.268548 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.269443 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.274623 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.274668 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67f2cc7-204e-4c8f-9c93-b02372c5c296-kube-api-access-tqjtz" (OuterVolumeSpecName: "kube-api-access-tqjtz") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "kube-api-access-tqjtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.275417 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.275909 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.277322 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.277459 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.277886 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.278210 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.283307 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c67f2cc7-204e-4c8f-9c93-b02372c5c296" (UID: "c67f2cc7-204e-4c8f-9c93-b02372c5c296"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368523 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368588 4757 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368598 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqjtz\" (UniqueName: \"kubernetes.io/projected/c67f2cc7-204e-4c8f-9c93-b02372c5c296-kube-api-access-tqjtz\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368607 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368618 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368632 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368643 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368654 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368667 4757 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c67f2cc7-204e-4c8f-9c93-b02372c5c296-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368678 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368690 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368699 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-template-error\") on 
node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368709 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:35 crc kubenswrapper[4757]: I1216 12:50:35.368719 4757 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c67f2cc7-204e-4c8f-9c93-b02372c5c296-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:36 crc kubenswrapper[4757]: I1216 12:50:36.041878 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" event={"ID":"c67f2cc7-204e-4c8f-9c93-b02372c5c296","Type":"ContainerDied","Data":"44e202b6b4fbaaabd0df44ec8e012d255a59b2b72d7d2018e903c5edc6667afd"} Dec 16 12:50:36 crc kubenswrapper[4757]: I1216 12:50:36.041949 4757 scope.go:117] "RemoveContainer" containerID="ff0ea011e2afd0fa1092c8bafd867e2d454b7236061787f7b15f8e42f3081df5" Dec 16 12:50:36 crc kubenswrapper[4757]: I1216 12:50:36.042071 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wzkwh" Dec 16 12:50:36 crc kubenswrapper[4757]: I1216 12:50:36.073254 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wzkwh"] Dec 16 12:50:36 crc kubenswrapper[4757]: I1216 12:50:36.078662 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wzkwh"] Dec 16 12:50:36 crc kubenswrapper[4757]: I1216 12:50:36.962997 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c67f2cc7-204e-4c8f-9c93-b02372c5c296" path="/var/lib/kubelet/pods/c67f2cc7-204e-4c8f-9c93-b02372c5c296/volumes" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.884379 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx"] Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885206 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885227 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885242 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885254 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885266 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="041b8f25-9797-41d0-9ae1-cb2a86f9c5a9" containerName="pruner" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885276 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="041b8f25-9797-41d0-9ae1-cb2a86f9c5a9" containerName="pruner" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885286 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="extract-content" Dec 16 12:50:44 crc 
kubenswrapper[4757]: I1216 12:50:44.885294 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885308 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885316 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885328 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67f2cc7-204e-4c8f-9c93-b02372c5c296" containerName="oauth-openshift" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885337 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67f2cc7-204e-4c8f-9c93-b02372c5c296" containerName="oauth-openshift" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885349 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885357 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885371 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885379 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885389 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885397 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885407 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885415 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885428 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885438 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="extract-utilities" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885451 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885461 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885473 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" 
containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885485 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="extract-content" Dec 16 12:50:44 crc kubenswrapper[4757]: E1216 12:50:44.885500 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885510 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885685 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6d95be-fcd9-4d4f-9c1b-1e532d4b56a7" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885702 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a2e530d-95e8-4bdc-8710-b08bb4d99f17" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885720 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67f2cc7-204e-4c8f-9c93-b02372c5c296" containerName="oauth-openshift" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885733 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1f22eb-cc7e-4057-a9af-a04d8fcf24f1" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885747 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="041b8f25-9797-41d0-9ae1-cb2a86f9c5a9" containerName="pruner" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.885763 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bca5421-7190-4386-9a4d-fc01e88be52e" containerName="registry-server" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.886351 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.889978 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.890373 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891550 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891625 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891631 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891679 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891680 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891754 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891788 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891789 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.891988 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.892226 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.900914 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.907957 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.909756 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx"] Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.941606 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.984938 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " 
pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985090 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985173 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985388 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985428 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-service-ca\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985519 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-router-certs\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985554 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-error\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985572 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-audit-dir\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985587 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-session\") pod 
\"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985612 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985641 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985663 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-login\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985681 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwck6\" (UniqueName: \"kubernetes.io/projected/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-kube-api-access-xwck6\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:44 crc kubenswrapper[4757]: I1216 12:50:44.985738 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-audit-policies\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.086876 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.086983 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.087053 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-service-ca\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.087102 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-router-certs\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.087146 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-error\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.087451 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-audit-dir\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.087235 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-audit-dir\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088262 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-session\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088331 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088392 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088465 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-login\") 
pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088513 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwck6\" (UniqueName: \"kubernetes.io/projected/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-kube-api-access-xwck6\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088587 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-audit-policies\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088660 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088726 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.089400 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-service-ca\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.090181 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.091238 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-audit-policies\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.093524 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-login\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " 
pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.088656 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.093681 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.093824 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-router-certs\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.094517 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-session\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.095598 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.096164 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.097886 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-user-template-error\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.105779 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " 
pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.123476 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwck6\" (UniqueName: \"kubernetes.io/projected/34cc6dcb-ec21-4206-b22f-77cbde87f5d2-kube-api-access-xwck6\") pod \"oauth-openshift-5b4bb77c4-rg9wx\" (UID: \"34cc6dcb-ec21-4206-b22f-77cbde87f5d2\") " pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.206794 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:45 crc kubenswrapper[4757]: I1216 12:50:45.593359 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx"] Dec 16 12:50:46 crc kubenswrapper[4757]: I1216 12:50:46.096039 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" event={"ID":"34cc6dcb-ec21-4206-b22f-77cbde87f5d2","Type":"ContainerStarted","Data":"1af7a3036529a74727fcebacb668ed943127df0f9871bee4051d013b6eb766bd"} Dec 16 12:50:46 crc kubenswrapper[4757]: I1216 12:50:46.096086 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" event={"ID":"34cc6dcb-ec21-4206-b22f-77cbde87f5d2","Type":"ContainerStarted","Data":"31246d069e5afaaad2526be840d24fc4870382fc2de72509a928f0d1ab740460"} Dec 16 12:50:46 crc kubenswrapper[4757]: I1216 12:50:46.097188 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:46 crc kubenswrapper[4757]: I1216 12:50:46.118484 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" podStartSLOduration=37.118465253 podStartE2EDuration="37.118465253s" podCreationTimestamp="2025-12-16 12:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:50:46.115288605 +0000 UTC m=+231.543032411" watchObservedRunningTime="2025-12-16 12:50:46.118465253 +0000 UTC m=+231.546209049" Dec 16 12:50:46 crc kubenswrapper[4757]: I1216 12:50:46.296853 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5b4bb77c4-rg9wx" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.533608 4757 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.537782 4757 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.538290 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04" gracePeriod=15 Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.538481 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.538868 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57" gracePeriod=15 Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.538963 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa" gracePeriod=15 Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.539051 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a" gracePeriod=15 Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.538945 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28" gracePeriod=15 Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.547225 4757 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549826 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549850 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549867 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549875 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549896 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549903 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549917 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549924 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549936 4757 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549943 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549963 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549970 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.549988 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.549994 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.550246 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.550264 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.550275 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.550290 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.550307 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.550656 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.577770 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.726280 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.726378 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.726408 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.726451 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.726985 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.727082 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.727127 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.727162 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828610 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828672 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828696 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828720 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828739 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828757 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828791 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828807 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828871 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828907 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828926 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828945 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828965 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.828983 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.829022 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.829043 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: I1216 12:50:47.878774 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:50:47 crc kubenswrapper[4757]: W1216 12:50:47.898243 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-3edbdad47c3f29e861bf366f55d123cc041374072cd9817d980c1718ef75d415 WatchSource:0}: Error finding container 3edbdad47c3f29e861bf366f55d123cc041374072cd9817d980c1718ef75d415: Status 404 returned error can't find the container with id 3edbdad47c3f29e861bf366f55d123cc041374072cd9817d980c1718ef75d415 Dec 16 12:50:47 crc kubenswrapper[4757]: E1216 12:50:47.901269 4757 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881b318c6b93fc9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 12:50:47.900561353 +0000 UTC m=+233.328305149,LastTimestamp:2025-12-16 12:50:47.900561353 +0000 UTC m=+233.328305149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.109921 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.111987 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.113980 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57" exitCode=0 Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.114093 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28" exitCode=0 Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.114104 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa" exitCode=0 Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.114113 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a" exitCode=2 Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.114194 4757 scope.go:117] "RemoveContainer" containerID="568f53eadf118636677125cfa563d036e90c3d19a0835a153c7451f00d79b3a9" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.116748 4757 generic.go:334] "Generic (PLEG): container finished" podID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" containerID="30350ecdc40e5317b6e3b511899e7a3ea245816a20f062ef69b715f4383cff03" exitCode=0 Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.116857 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18","Type":"ContainerDied","Data":"30350ecdc40e5317b6e3b511899e7a3ea245816a20f062ef69b715f4383cff03"} Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.117926 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.118437 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.118871 4757 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:48 crc kubenswrapper[4757]: I1216 12:50:48.123654 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3edbdad47c3f29e861bf366f55d123cc041374072cd9817d980c1718ef75d415"} Dec 16 12:50:48 crc kubenswrapper[4757]: E1216 12:50:48.521993 4757 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881b318c6b93fc9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 12:50:47.900561353 +0000 UTC m=+233.328305149,LastTimestamp:2025-12-16 12:50:47.900561353 +0000 UTC m=+233.328305149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.136778 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.139847 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fad05903fc52ace047ff7f12cb505827333a3ff7e7aff802b3c3b9f98b8cc4f6"} Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.140358 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.140797 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.366237 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.367073 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.367536 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.551673 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kubelet-dir\") pod \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.551773 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-var-lock\") pod \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.551816 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kube-api-access\") pod \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\" (UID: \"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18\") " Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.551812 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" (UID: "8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.551849 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-var-lock" (OuterVolumeSpecName: "var-lock") pod "8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" (UID: "8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.551993 4757 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.552045 4757 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.556817 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" (UID: "8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.654958 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.937120 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.938482 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.939178 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.939495 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:49 crc kubenswrapper[4757]: I1216 12:50:49.939907 4757 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.007853 4757 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.008275 4757 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 
12:50:50.008709 4757 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.009028 4757 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.009303 4757 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.009340 4757 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.009519 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="200ms" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.059925 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060079 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060114 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060170 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060234 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060226 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060605 4757 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060640 4757 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.060654 4757 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.145678 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18","Type":"ContainerDied","Data":"eb6c0222c0b37dd81e65b033bf3cfe2a06ce581f75050fa634f12915808433b7"} Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.145706 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.145725 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb6c0222c0b37dd81e65b033bf3cfe2a06ce581f75050fa634f12915808433b7" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.149231 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.149925 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04" exitCode=0 Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.149999 4757 scope.go:117] "RemoveContainer" containerID="966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.150202 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.158480 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.158917 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.159112 4757 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.165321 4757 scope.go:117] "RemoveContainer" containerID="a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.166113 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.166433 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.166769 4757 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.179252 4757 scope.go:117] "RemoveContainer" containerID="efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.192434 4757 scope.go:117] "RemoveContainer" containerID="3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.202857 4757 scope.go:117] "RemoveContainer" containerID="a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.210936 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="400ms" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 
12:50:50.217848 4757 scope.go:117] "RemoveContainer" containerID="eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.238195 4757 scope.go:117] "RemoveContainer" containerID="966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.238607 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\": container with ID starting with 966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57 not found: ID does not exist" containerID="966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.238642 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57"} err="failed to get container status \"966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\": rpc error: code = NotFound desc = could not find container \"966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57\": container with ID starting with 966a2858611244721ba061a67c08ba39d190eebbfbcff8f2b9b6e28dafb3fa57 not found: ID does not exist" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.238671 4757 scope.go:117] "RemoveContainer" containerID="a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.240808 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\": container with ID starting with a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28 not found: ID does not exist" containerID="a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.240861 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28"} err="failed to get container status \"a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\": rpc error: code = NotFound desc = could not find container \"a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28\": container with ID starting with a2834a0e84d49470fe5add3dfb85fae99744ace6c593b235e73933945c5e0e28 not found: ID does not exist" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.240923 4757 scope.go:117] "RemoveContainer" containerID="efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.241520 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\": container with ID starting with efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa not found: ID does not exist" containerID="efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.241553 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa"} err="failed to get container status 
\"efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\": rpc error: code = NotFound desc = could not find container \"efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa\": container with ID starting with efd8b6975efba720cfb69c2c988c4f67da1fe6beb7b7474a9341d2d0886031fa not found: ID does not exist" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.241573 4757 scope.go:117] "RemoveContainer" containerID="3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.241863 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\": container with ID starting with 3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a not found: ID does not exist" containerID="3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.241896 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a"} err="failed to get container status \"3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\": rpc error: code = NotFound desc = could not find container \"3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a\": container with ID starting with 3e153366831581a51c26ccad055a797e6f673d21429ceb554938827fe482eb9a not found: ID does not exist" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.241918 4757 scope.go:117] "RemoveContainer" containerID="a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.242181 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\": container with ID starting with a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04 not found: ID does not exist" containerID="a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.242209 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04"} err="failed to get container status \"a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\": rpc error: code = NotFound desc = could not find container \"a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04\": container with ID starting with a854826ed0b11d5b1e6efaa04de4c3fcf12da3dd222ad8695788b0e63f61bf04 not found: ID does not exist" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.242225 4757 scope.go:117] "RemoveContainer" containerID="eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.242607 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\": container with ID starting with eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58 not found: ID does not exist" containerID="eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.242632 4757 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58"} err="failed to get container status \"eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\": rpc error: code = NotFound desc = could not find container \"eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58\": container with ID starting with eb5ba64283814570349abe9c7642c712dac3a260166d4617f16421be24e70c58 not found: ID does not exist" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.612550 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="800ms" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.832494 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:50:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:50:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:50:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T12:50:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.832844 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.833314 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.833618 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.833994 4757 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:50 crc kubenswrapper[4757]: E1216 12:50:50.834037 4757 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 12:50:50 crc kubenswrapper[4757]: I1216 12:50:50.956295 4757 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 16 12:50:51 crc kubenswrapper[4757]: E1216 12:50:51.414148 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="1.6s" Dec 16 12:50:53 crc kubenswrapper[4757]: E1216 12:50:53.014798 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="3.2s" Dec 16 12:50:54 crc kubenswrapper[4757]: I1216 12:50:54.952453 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:54 crc kubenswrapper[4757]: I1216 12:50:54.952830 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:50:56 crc kubenswrapper[4757]: E1216 12:50:56.215897 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="6.4s" Dec 16 12:50:58 crc kubenswrapper[4757]: E1216 12:50:58.523799 4757 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.110:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881b318c6b93fc9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 12:50:47.900561353 +0000 UTC m=+233.328305149,LastTimestamp:2025-12-16 12:50:47.900561353 +0000 UTC m=+233.328305149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 12:51:00 crc kubenswrapper[4757]: E1216 12:51:00.993185 4757 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 
38.102.83.110:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" volumeName="registry-storage" Dec 16 12:51:01 crc kubenswrapper[4757]: I1216 12:51:01.948524 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:01 crc kubenswrapper[4757]: I1216 12:51:01.949426 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:01 crc kubenswrapper[4757]: I1216 12:51:01.949768 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:01 crc kubenswrapper[4757]: I1216 12:51:01.966758 4757 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:01 crc kubenswrapper[4757]: I1216 12:51:01.966810 4757 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:01 crc kubenswrapper[4757]: E1216 12:51:01.967479 4757 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:01 crc kubenswrapper[4757]: I1216 12:51:01.967937 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:01 crc kubenswrapper[4757]: W1216 12:51:01.990248 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-90a137f004667d9b1d4c7942b5c8f6631f2d069eb0b3dda2a807603d8902b20f WatchSource:0}: Error finding container 90a137f004667d9b1d4c7942b5c8f6631f2d069eb0b3dda2a807603d8902b20f: Status 404 returned error can't find the container with id 90a137f004667d9b1d4c7942b5c8f6631f2d069eb0b3dda2a807603d8902b20f Dec 16 12:51:02 crc kubenswrapper[4757]: I1216 12:51:02.220940 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"90a137f004667d9b1d4c7942b5c8f6631f2d069eb0b3dda2a807603d8902b20f"} Dec 16 12:51:02 crc kubenswrapper[4757]: E1216 12:51:02.617130 4757 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.110:6443: connect: connection refused" interval="7s" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.228418 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.228467 4757 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c" exitCode=1 Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.228523 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c"} Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.228981 4757 scope.go:117] "RemoveContainer" containerID="9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.229483 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.229733 4757 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.230025 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 
12:51:03.230686 4757 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="834741c2a5cb131292275cf12bba218fa34a9685c480e2d66f84ea5885088b06" exitCode=0 Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.230747 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"834741c2a5cb131292275cf12bba218fa34a9685c480e2d66f84ea5885088b06"} Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.231052 4757 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.231087 4757 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.231317 4757 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:03 crc kubenswrapper[4757]: E1216 12:51:03.231324 4757 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.231695 4757 status_manager.go:851] "Failed to get status for pod" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:03 crc kubenswrapper[4757]: I1216 12:51:03.231986 4757 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.110:6443: connect: connection refused" Dec 16 12:51:04 crc kubenswrapper[4757]: I1216 12:51:04.247274 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 12:51:04 crc kubenswrapper[4757]: I1216 12:51:04.247895 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"31cc4cc27906ee7e5415e9f8ee9159e8dbf3106af6cdc17089aebbc68aae2cd6"} Dec 16 12:51:04 crc kubenswrapper[4757]: I1216 12:51:04.269232 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"058c091392b11f643b568e0da88df35d33ae2abe63ad208a6831b994d37e9fe3"} Dec 16 12:51:04 crc kubenswrapper[4757]: I1216 12:51:04.272174 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ccb8942a641a087a40d8bafb616a96b192c2314564b718c570e561c954d8603"} Dec 16 12:51:04 crc kubenswrapper[4757]: I1216 12:51:04.272191 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c9cc95f5231b7ddb296b6de7f652df4a79e9176d97fbec1e2b556ce13229c472"} Dec 16 12:51:04 crc kubenswrapper[4757]: I1216 12:51:04.272205 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fd2fee0c8214ff32eec8b1e54a8c60bad2c07ec207436c61ab7beb5ddf2b194f"} Dec 16 12:51:05 crc kubenswrapper[4757]: I1216 12:51:05.278462 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f1d30164d837d4ff26de987cbe1e0116a484f5cc9ff07089637b912813b80f19"} Dec 16 12:51:05 crc kubenswrapper[4757]: I1216 12:51:05.278660 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:05 crc kubenswrapper[4757]: I1216 12:51:05.278769 4757 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:05 crc kubenswrapper[4757]: I1216 12:51:05.278796 4757 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:06 crc kubenswrapper[4757]: I1216 12:51:06.291107 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:51:06 crc kubenswrapper[4757]: I1216 12:51:06.295754 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:51:06 crc kubenswrapper[4757]: I1216 12:51:06.968727 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:06 crc kubenswrapper[4757]: I1216 12:51:06.968777 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:06 crc kubenswrapper[4757]: I1216 12:51:06.973430 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:07 crc kubenswrapper[4757]: I1216 12:51:07.289046 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:51:10 crc kubenswrapper[4757]: I1216 12:51:10.295899 4757 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:11 crc kubenswrapper[4757]: I1216 12:51:11.307994 4757 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:11 crc kubenswrapper[4757]: I1216 12:51:11.308427 4757 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:11 crc kubenswrapper[4757]: I1216 
12:51:11.311834 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:11 crc kubenswrapper[4757]: I1216 12:51:11.314565 4757 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="62d7762f-d264-414c-ad4f-765fefbc17c1" Dec 16 12:51:12 crc kubenswrapper[4757]: I1216 12:51:12.313392 4757 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:12 crc kubenswrapper[4757]: I1216 12:51:12.313437 4757 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="149ec790-f813-4055-8986-3674f9b10732" Dec 16 12:51:14 crc kubenswrapper[4757]: I1216 12:51:14.965770 4757 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="62d7762f-d264-414c-ad4f-765fefbc17c1" Dec 16 12:51:20 crc kubenswrapper[4757]: I1216 12:51:20.523720 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 12:51:20 crc kubenswrapper[4757]: I1216 12:51:20.704838 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 12:51:20 crc kubenswrapper[4757]: I1216 12:51:20.751441 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 12:51:20 crc kubenswrapper[4757]: I1216 12:51:20.918377 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.035688 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.038317 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.130275 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.193635 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.476749 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.521494 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.541184 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 12:51:21 crc kubenswrapper[4757]: I1216 12:51:21.709852 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 12:51:22 crc kubenswrapper[4757]: I1216 12:51:22.061542 4757 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 12:51:22 crc kubenswrapper[4757]: I1216 12:51:22.064836 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 12:51:22 crc kubenswrapper[4757]: I1216 12:51:22.068214 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 12:51:22 crc kubenswrapper[4757]: I1216 12:51:22.192468 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 12:51:22 crc kubenswrapper[4757]: I1216 12:51:22.858972 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 12:51:22 crc kubenswrapper[4757]: I1216 12:51:22.975885 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.057267 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.130181 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.133623 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.241601 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.274290 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.310354 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.348657 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.670199 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.724712 4757 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.724876 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.768692 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.835490 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 12:51:23 crc kubenswrapper[4757]: I1216 12:51:23.905571 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.162683 4757 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.174660 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.191779 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.223830 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.255493 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.279505 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.323868 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.415224 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.486613 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.497947 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.601373 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.632655 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.667951 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.677520 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.801717 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.978430 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 12:51:24 crc kubenswrapper[4757]: I1216 12:51:24.990876 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.103072 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.124597 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.153755 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 
16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.191026 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.251974 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.272373 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.491391 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.650313 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.678118 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.727754 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.777403 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.932342 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.972411 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 12:51:25 crc kubenswrapper[4757]: I1216 12:51:25.976746 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.012597 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.098920 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.128713 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.190251 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.229095 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.253863 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.281616 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.292067 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 
12:51:26.319205 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.321412 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.381046 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.426913 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.467226 4757 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.470071 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.470049276 podStartE2EDuration="39.470049276s" podCreationTimestamp="2025-12-16 12:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:51:10.166174915 +0000 UTC m=+255.593918711" watchObservedRunningTime="2025-12-16 12:51:26.470049276 +0000 UTC m=+271.897793072" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.472255 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.472306 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.477760 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.496245 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.496229163 podStartE2EDuration="16.496229163s" podCreationTimestamp="2025-12-16 12:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:51:26.494224503 +0000 UTC m=+271.921968319" watchObservedRunningTime="2025-12-16 12:51:26.496229163 +0000 UTC m=+271.923972959" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.522242 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.531923 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.538182 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.581109 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.608659 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.641289 4757 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.653760 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.667907 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.781978 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.840734 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.881403 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.914522 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 12:51:26 crc kubenswrapper[4757]: I1216 12:51:26.984213 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.067628 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.123530 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.158510 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.195162 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.249583 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.295248 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.362757 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.370067 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.440088 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.541312 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.558489 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 12:51:27 crc 
kubenswrapper[4757]: I1216 12:51:27.585118 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.601697 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.619496 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.622967 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.624070 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.680826 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.708179 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.724887 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.770530 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.812347 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.845211 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.862050 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.862059 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.946250 4757 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 12:51:27 crc kubenswrapper[4757]: I1216 12:51:27.987779 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.004059 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.006200 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.009118 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.072734 4757 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.102400 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.130268 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.141611 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.298423 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.321826 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.346893 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.360918 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.437792 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.572520 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.600049 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.627824 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.628860 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.655974 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.667867 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.761918 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.793808 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.796776 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.825664 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 12:51:28 crc kubenswrapper[4757]: I1216 12:51:28.947144 4757 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.074132 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.109768 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.137373 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.171351 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.304918 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.352969 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.428845 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.487312 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.491302 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.508034 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.607234 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.658186 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.662867 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.666904 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.690359 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.716495 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.889792 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.936908 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 12:51:29 crc kubenswrapper[4757]: I1216 12:51:29.949088 4757 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.028719 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.047227 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.206476 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.324744 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.350969 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.351094 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.409927 4757 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.411985 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.534857 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.537718 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.575299 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.577951 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.723761 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.778177 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.794229 4757 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.862689 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 12:51:30 crc kubenswrapper[4757]: I1216 12:51:30.908170 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.006462 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.008307 4757 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.009820 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.150235 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.172084 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.183363 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.238141 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.426908 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.433756 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.471993 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.510161 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.594117 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.602350 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.619715 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.640347 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.651223 4757 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.707664 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.825517 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.862796 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.881428 4757 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.961732 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 12:51:31 crc kubenswrapper[4757]: I1216 12:51:31.974494 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.001437 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.084405 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.093925 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.123435 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.182949 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.341866 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.430748 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.459671 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.505885 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.519115 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.530520 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.627551 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.627550 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.727607 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.856876 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.881948 4757 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.882185 4757 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fad05903fc52ace047ff7f12cb505827333a3ff7e7aff802b3c3b9f98b8cc4f6" gracePeriod=5 Dec 16 12:51:32 crc kubenswrapper[4757]: I1216 12:51:32.923929 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.006611 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.008444 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.121310 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.272531 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.494608 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.520162 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.623516 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.681631 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.724335 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.947058 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.950347 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 12:51:33 crc kubenswrapper[4757]: I1216 12:51:33.968362 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.024097 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.045055 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.197384 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.221260 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" 
Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.292311 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.416454 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.478829 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.590791 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.651929 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.692836 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.732228 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.759258 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.765759 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.776189 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 12:51:34 crc kubenswrapper[4757]: I1216 12:51:34.784532 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.012351 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.180343 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.200634 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.440692 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.554447 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.612415 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 12:51:35 crc kubenswrapper[4757]: I1216 12:51:35.839892 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 12:51:36 crc kubenswrapper[4757]: I1216 12:51:36.482683 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 12:51:36 crc kubenswrapper[4757]: I1216 12:51:36.499518 4757 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 12:51:36 crc kubenswrapper[4757]: I1216 12:51:36.546869 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 12:51:36 crc kubenswrapper[4757]: I1216 12:51:36.655887 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 12:51:37 crc kubenswrapper[4757]: I1216 12:51:37.010954 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.447475 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.447786 4757 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fad05903fc52ace047ff7f12cb505827333a3ff7e7aff802b3c3b9f98b8cc4f6" exitCode=137 Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.447822 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edbdad47c3f29e861bf366f55d123cc041374072cd9817d980c1718ef75d415" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.459358 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.459431 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520428 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520528 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520566 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520568 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520623 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520627 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520650 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520895 4757 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520919 4757 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520960 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.520989 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.528864 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.621640 4757 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.622033 4757 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.622049 4757 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.955172 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.955394 4757 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.967231 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.967287 4757 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ffe71c3e-5187-4082-aa42-df4da79f8061" Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.970857 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 12:51:38 crc kubenswrapper[4757]: I1216 12:51:38.970897 4757 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ffe71c3e-5187-4082-aa42-df4da79f8061"
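The teardown records above follow the kubelet volume manager's standard three-step pattern: the reconciler notices a mounted volume that is no longer desired ("operationExecutor.UnmountVolume started"), the operation executor unmounts it ("UnmountVolume.TearDown succeeded"), and the actual state of the world is updated ("Volume detached ... DevicePath \"\""), after which the orphaned pod volumes directory can be removed. A toy Go sketch of that desired-state/actual-state diff (names invented for illustration; this is not the kubelet's reconciler):

    package main

    import "fmt"

    // reconcile unmounts anything present in actual but absent from
    // desired, the same diff the reconciler_common.go records describe.
    func reconcile(desired, actual map[string]bool) {
        for vol := range actual {
            if !desired[vol] {
                fmt.Printf("UnmountVolume started for volume %q\n", vol)
                delete(actual, vol) // stands in for TearDown succeeding
                fmt.Printf("Volume detached for volume %q\n", vol)
            }
        }
    }

    func main() {
        desired := map[string]bool{} // pod deleted: nothing is desired anymore
        actual := map[string]bool{"var-log": true, "manifests": true, "resource-dir": true}
        reconcile(desired, actual)
    }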
Dec 16 12:51:39 crc kubenswrapper[4757]: I1216 12:51:39.451855 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 12:51:52 crc kubenswrapper[4757]: I1216 12:51:52.512526 4757 generic.go:334] "Generic (PLEG): container finished" podID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerID="16838a1f8a7a7bcbdd967839d36df6d30ef8312ac5f3969b68c679221368b9e6" exitCode=0 Dec 16 12:51:52 crc kubenswrapper[4757]: I1216 12:51:52.512649 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" event={"ID":"5706c05b-ab36-4ed2-ac86-06146a1bddda","Type":"ContainerDied","Data":"16838a1f8a7a7bcbdd967839d36df6d30ef8312ac5f3969b68c679221368b9e6"} Dec 16 12:51:52 crc kubenswrapper[4757]: I1216 12:51:52.513895 4757 scope.go:117] "RemoveContainer" containerID="16838a1f8a7a7bcbdd967839d36df6d30ef8312ac5f3969b68c679221368b9e6" Dec 16 12:51:52 crc kubenswrapper[4757]: I1216 12:51:52.575781 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:51:52 crc kubenswrapper[4757]: I1216 12:51:52.576242 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:51:53 crc kubenswrapper[4757]: I1216 12:51:53.518944 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" event={"ID":"5706c05b-ab36-4ed2-ac86-06146a1bddda","Type":"ContainerStarted","Data":"5a774c11e5e3b64c7da7ce4bf6b50785c9ac0003db27591f60e004ac801bbc95"} Dec 16 12:51:53 crc kubenswrapper[4757]: I1216 12:51:53.519721 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:51:53 crc kubenswrapper[4757]: I1216 12:51:53.521089 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:51:54 crc kubenswrapper[4757]: I1216 12:51:54.806063 4757 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 16 12:51:59 crc kubenswrapper[4757]: I1216 12:51:59.100345 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 12:52:01 crc kubenswrapper[4757]: I1216 12:52:01.623910 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 12:52:03 crc kubenswrapper[4757]: I1216 12:52:03.834553 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 12:52:21 crc kubenswrapper[4757]: I1216 12:52:21.181327 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:52:21 crc kubenswrapper[4757]: I1216 12:52:21.181958 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:52:39 crc kubenswrapper[4757]: I1216 12:52:39.707654 4757 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xwpj8"] Dec 16 12:52:39 crc kubenswrapper[4757]: I1216 12:52:39.708342 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" podUID="df94dff2-af59-42da-be83-0eb6c9aba353" containerName="controller-manager" containerID="cri-o://f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9" gracePeriod=30 Dec 16 12:52:39 crc kubenswrapper[4757]: I1216 12:52:39.822467 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647"] Dec 16 12:52:39 crc kubenswrapper[4757]: I1216 12:52:39.823241 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" podUID="4daf4899-3f47-4776-b4ef-a54a340e95f5" containerName="route-controller-manager" containerID="cri-o://785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4" gracePeriod=30 Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.162759 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.182734 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c9sm\" (UniqueName: \"kubernetes.io/projected/df94dff2-af59-42da-be83-0eb6c9aba353-kube-api-access-8c9sm\") pod \"df94dff2-af59-42da-be83-0eb6c9aba353\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.182777 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-proxy-ca-bundles\") pod \"df94dff2-af59-42da-be83-0eb6c9aba353\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.182921 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-config\") pod \"df94dff2-af59-42da-be83-0eb6c9aba353\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.182944 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df94dff2-af59-42da-be83-0eb6c9aba353-serving-cert\") pod \"df94dff2-af59-42da-be83-0eb6c9aba353\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.182971 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-client-ca\") pod \"df94dff2-af59-42da-be83-0eb6c9aba353\" (UID: \"df94dff2-af59-42da-be83-0eb6c9aba353\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.184675 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "df94dff2-af59-42da-be83-0eb6c9aba353" (UID: "df94dff2-af59-42da-be83-0eb6c9aba353"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.185458 4757 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.186342 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-config" (OuterVolumeSpecName: "config") pod "df94dff2-af59-42da-be83-0eb6c9aba353" (UID: "df94dff2-af59-42da-be83-0eb6c9aba353"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.187278 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-client-ca" (OuterVolumeSpecName: "client-ca") pod "df94dff2-af59-42da-be83-0eb6c9aba353" (UID: "df94dff2-af59-42da-be83-0eb6c9aba353"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.198655 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df94dff2-af59-42da-be83-0eb6c9aba353-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df94dff2-af59-42da-be83-0eb6c9aba353" (UID: "df94dff2-af59-42da-be83-0eb6c9aba353"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.206001 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df94dff2-af59-42da-be83-0eb6c9aba353-kube-api-access-8c9sm" (OuterVolumeSpecName: "kube-api-access-8c9sm") pod "df94dff2-af59-42da-be83-0eb6c9aba353" (UID: "df94dff2-af59-42da-be83-0eb6c9aba353"). InnerVolumeSpecName "kube-api-access-8c9sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.241249 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286493 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-client-ca\") pod \"4daf4899-3f47-4776-b4ef-a54a340e95f5\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286594 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk5pv\" (UniqueName: \"kubernetes.io/projected/4daf4899-3f47-4776-b4ef-a54a340e95f5-kube-api-access-jk5pv\") pod \"4daf4899-3f47-4776-b4ef-a54a340e95f5\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286622 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daf4899-3f47-4776-b4ef-a54a340e95f5-serving-cert\") pod \"4daf4899-3f47-4776-b4ef-a54a340e95f5\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286653 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-config\") pod \"4daf4899-3f47-4776-b4ef-a54a340e95f5\" (UID: \"4daf4899-3f47-4776-b4ef-a54a340e95f5\") " Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286828 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286839 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df94dff2-af59-42da-be83-0eb6c9aba353-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286848 4757 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df94dff2-af59-42da-be83-0eb6c9aba353-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.286857 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c9sm\" (UniqueName: \"kubernetes.io/projected/df94dff2-af59-42da-be83-0eb6c9aba353-kube-api-access-8c9sm\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.287279 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-config" (OuterVolumeSpecName: "config") pod "4daf4899-3f47-4776-b4ef-a54a340e95f5" (UID: "4daf4899-3f47-4776-b4ef-a54a340e95f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.287270 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "4daf4899-3f47-4776-b4ef-a54a340e95f5" (UID: "4daf4899-3f47-4776-b4ef-a54a340e95f5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.289886 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4daf4899-3f47-4776-b4ef-a54a340e95f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4daf4899-3f47-4776-b4ef-a54a340e95f5" (UID: "4daf4899-3f47-4776-b4ef-a54a340e95f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.291335 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4daf4899-3f47-4776-b4ef-a54a340e95f5-kube-api-access-jk5pv" (OuterVolumeSpecName: "kube-api-access-jk5pv") pod "4daf4899-3f47-4776-b4ef-a54a340e95f5" (UID: "4daf4899-3f47-4776-b4ef-a54a340e95f5"). InnerVolumeSpecName "kube-api-access-jk5pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.387504 4757 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.387548 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk5pv\" (UniqueName: \"kubernetes.io/projected/4daf4899-3f47-4776-b4ef-a54a340e95f5-kube-api-access-jk5pv\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.387558 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4daf4899-3f47-4776-b4ef-a54a340e95f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.387569 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4daf4899-3f47-4776-b4ef-a54a340e95f5-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.748726 4757 generic.go:334] "Generic (PLEG): container finished" podID="df94dff2-af59-42da-be83-0eb6c9aba353" containerID="f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9" exitCode=0 Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.749159 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" event={"ID":"df94dff2-af59-42da-be83-0eb6c9aba353","Type":"ContainerDied","Data":"f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9"} Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.749634 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" event={"ID":"df94dff2-af59-42da-be83-0eb6c9aba353","Type":"ContainerDied","Data":"1cadade677b43d6f8031ab7c5a8c091f733492ec996141b25e7232ef9abd34d9"} Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.749200 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xwpj8" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.749730 4757 scope.go:117] "RemoveContainer" containerID="f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.750946 4757 generic.go:334] "Generic (PLEG): container finished" podID="4daf4899-3f47-4776-b4ef-a54a340e95f5" containerID="785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4" exitCode=0 Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.751020 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" event={"ID":"4daf4899-3f47-4776-b4ef-a54a340e95f5","Type":"ContainerDied","Data":"785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4"} Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.751049 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" event={"ID":"4daf4899-3f47-4776-b4ef-a54a340e95f5","Type":"ContainerDied","Data":"a3a484cc25575431e7f37134e059931086f2dca5c037615e17754e3cc1b1cea4"} Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.751101 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.763815 4757 scope.go:117] "RemoveContainer" containerID="f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9" Dec 16 12:52:40 crc kubenswrapper[4757]: E1216 12:52:40.764248 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9\": container with ID starting with f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9 not found: ID does not exist" containerID="f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.764305 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9"} err="failed to get container status \"f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9\": rpc error: code = NotFound desc = could not find container \"f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9\": container with ID starting with f422f02a2395140b5d420219e5a662c4af77f7283a3ff08bac33482c406227a9 not found: ID does not exist" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.764356 4757 scope.go:117] "RemoveContainer" containerID="785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.779437 4757 scope.go:117] "RemoveContainer" containerID="785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4" Dec 16 12:52:40 crc kubenswrapper[4757]: E1216 12:52:40.779817 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4\": container with ID starting with 785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4 not found: ID does not exist" containerID="785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4" Dec 16 12:52:40 crc 
kubenswrapper[4757]: I1216 12:52:40.779878 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4"} err="failed to get container status \"785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4\": rpc error: code = NotFound desc = could not find container \"785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4\": container with ID starting with 785adbfdb794286aaad08d8f21a48e477a85df056cb2ccdcfcce6453a80f5ae4 not found: ID does not exist" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.783326 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647"] Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.788853 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-v9647"] Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.795238 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xwpj8"] Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.799437 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xwpj8"] Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.946674 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm"] Dec 16 12:52:40 crc kubenswrapper[4757]: E1216 12:52:40.947140 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" containerName="installer" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947254 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" containerName="installer" Dec 16 12:52:40 crc kubenswrapper[4757]: E1216 12:52:40.947331 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df94dff2-af59-42da-be83-0eb6c9aba353" containerName="controller-manager" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947392 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="df94dff2-af59-42da-be83-0eb6c9aba353" containerName="controller-manager" Dec 16 12:52:40 crc kubenswrapper[4757]: E1216 12:52:40.947455 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4daf4899-3f47-4776-b4ef-a54a340e95f5" containerName="route-controller-manager" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947507 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="4daf4899-3f47-4776-b4ef-a54a340e95f5" containerName="route-controller-manager" Dec 16 12:52:40 crc kubenswrapper[4757]: E1216 12:52:40.947568 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947624 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947770 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8376dda8-bc64-4d9f-9ccf-3ad8a4ef1c18" containerName="installer" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947856 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
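The E-level "ContainerStatus from runtime service failed ... NotFound" records above are a benign race rather than a real failure: the kubelet re-issues RemoveContainer for container IDs that CRI-O has already deleted together with the pod, so the status lookup that precedes deletion returns NotFound; the kubelet logs the error and moves on. The adjacent cpu_manager/memory_manager "RemoveStaleState" records are the resource managers dropping per-container CPU and memory assignments for the same deleted pods before the replacement controller-manager pod is admitted. Cleanup like this is written to be idempotent; a hedged Go sketch of the pattern (errNotFound and the store map are hypothetical, not CRI or kubelet code):

    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("container not found")

    // removeContainer treats "already gone" as success, which is why the
    // NotFound records above are logged and then ignored.
    func removeContainer(store map[string]bool, id string) error {
        if !store[id] {
            return fmt.Errorf("could not find container %q: %w", id, errNotFound)
        }
        delete(store, id)
        return nil
    }

    func main() {
        store := map[string]bool{} // the runtime already cleaned everything up
        err := removeContainer(store, "f422f02a2395")
        if errors.Is(err, errNotFound) {
            fmt.Println("already removed; nothing to do") // benign, as in the log
        }
    }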
Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.947933 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="df94dff2-af59-42da-be83-0eb6c9aba353" containerName="controller-manager" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.948033 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="4daf4899-3f47-4776-b4ef-a54a340e95f5" containerName="route-controller-manager" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.948519 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.951321 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.951758 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.951967 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.951472 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.952540 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.953515 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.956419 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4daf4899-3f47-4776-b4ef-a54a340e95f5" path="/var/lib/kubelet/pods/4daf4899-3f47-4776-b4ef-a54a340e95f5/volumes" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.957050 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df94dff2-af59-42da-be83-0eb6c9aba353" path="/var/lib/kubelet/pods/df94dff2-af59-42da-be83-0eb6c9aba353/volumes" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.957173 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.970085 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm"] Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.993809 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csvpc\" (UniqueName: \"kubernetes.io/projected/ec969be3-3091-446a-9638-024ed57e190d-kube-api-access-csvpc\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.994099 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec969be3-3091-446a-9638-024ed57e190d-serving-cert\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.994279 4757 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-client-ca\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.994373 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-proxy-ca-bundles\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:40 crc kubenswrapper[4757]: I1216 12:52:40.994492 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-config\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.095853 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-client-ca\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.095913 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-proxy-ca-bundles\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.095979 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-config\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.096072 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csvpc\" (UniqueName: \"kubernetes.io/projected/ec969be3-3091-446a-9638-024ed57e190d-kube-api-access-csvpc\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.096103 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec969be3-3091-446a-9638-024ed57e190d-serving-cert\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.097622 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-config\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.097686 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-proxy-ca-bundles\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.098387 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-client-ca\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.109104 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec969be3-3091-446a-9638-024ed57e190d-serving-cert\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.116318 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csvpc\" (UniqueName: \"kubernetes.io/projected/ec969be3-3091-446a-9638-024ed57e190d-kube-api-access-csvpc\") pod \"controller-manager-6fdb9d5c89-sbbrm\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.263400 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.441556 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm"] Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.770738 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" event={"ID":"ec969be3-3091-446a-9638-024ed57e190d","Type":"ContainerStarted","Data":"f01579e1381f8df4c0824c3778636e27132f1f181531cbc67f2ff8e71d696cfc"} Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.771107 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" event={"ID":"ec969be3-3091-446a-9638-024ed57e190d","Type":"ContainerStarted","Data":"b0b00e35ade3521d52fad9aec645ac6477a5d71c810a248a2a4e3725a09b4c3c"} Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.771132 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.780115 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.790193 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" podStartSLOduration=2.7901761929999997 podStartE2EDuration="2.790176193s" podCreationTimestamp="2025-12-16 12:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:52:41.788170871 +0000 UTC m=+347.215914687" watchObservedRunningTime="2025-12-16 12:52:41.790176193 +0000 UTC m=+347.217919989" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.948926 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8"] Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.949770 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.951977 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.951977 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.952226 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.952323 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.952401 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.952990 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 12:52:41 crc kubenswrapper[4757]: I1216 12:52:41.961517 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8"] Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.009051 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26csb\" (UniqueName: \"kubernetes.io/projected/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-kube-api-access-26csb\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.009139 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-serving-cert\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.009200 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-client-ca\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.009219 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-config\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.110699 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-serving-cert\") pod 
\"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.110805 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-client-ca\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.110834 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-config\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.110889 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26csb\" (UniqueName: \"kubernetes.io/projected/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-kube-api-access-26csb\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.111764 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-client-ca\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.112110 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-config\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.115157 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-serving-cert\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.131198 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26csb\" (UniqueName: \"kubernetes.io/projected/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-kube-api-access-26csb\") pod \"route-controller-manager-77ff678cd4-szzr8\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.267155 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.475541 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8"] Dec 16 12:52:42 crc kubenswrapper[4757]: W1216 12:52:42.484951 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18b3dafa_cb9c_443e_b2b9_0b6a43776e6b.slice/crio-c54b74379eb356e0b6ee3b9e0eac3c876f9ac788cdd0704afa39fd6f6c4f50f8 WatchSource:0}: Error finding container c54b74379eb356e0b6ee3b9e0eac3c876f9ac788cdd0704afa39fd6f6c4f50f8: Status 404 returned error can't find the container with id c54b74379eb356e0b6ee3b9e0eac3c876f9ac788cdd0704afa39fd6f6c4f50f8 Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.779463 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" event={"ID":"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b","Type":"ContainerStarted","Data":"e4e1d1efe3fc181480585fcbc679a7a49af1069362f2aebe04a886a4c40f861e"} Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.779510 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" event={"ID":"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b","Type":"ContainerStarted","Data":"c54b74379eb356e0b6ee3b9e0eac3c876f9ac788cdd0704afa39fd6f6c4f50f8"} Dec 16 12:52:42 crc kubenswrapper[4757]: I1216 12:52:42.795255 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" podStartSLOduration=3.7952372580000002 podStartE2EDuration="3.795237258s" podCreationTimestamp="2025-12-16 12:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:52:42.794699137 +0000 UTC m=+348.222442953" watchObservedRunningTime="2025-12-16 12:52:42.795237258 +0000 UTC m=+348.222981054" Dec 16 12:52:43 crc kubenswrapper[4757]: I1216 12:52:43.784939 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:43 crc kubenswrapper[4757]: I1216 12:52:43.790963 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:52:51 crc kubenswrapper[4757]: I1216 12:52:51.181647 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:52:51 crc kubenswrapper[4757]: I1216 12:52:51.182271 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:52:59 crc kubenswrapper[4757]: I1216 12:52:59.716135 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8"] 
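The machine-config-daemon liveness failures above (12:52:21 and again at 12:52:51, one 30-second probe period apart) are plain TCP connection refusals: nothing is listening on 127.0.0.1:8798 at probe time, so the check fails before any HTTP exchange happens. A kubelet HTTP probe is an ordinary GET with a timeout, healthy on any 2xx/3xx status; a minimal stand-alone approximation in Go (the URL and failure text mirror the log, but this is not kubelet's prober implementation):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe performs one HTTP liveness check: a transport error such as
    // "connect: connection refused" or a status >= 400 counts as failure.
    func probe(url string) error {
        client := http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }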
Dec 16 12:52:59 crc kubenswrapper[4757]: I1216 12:52:59.717814 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" podUID="18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" containerName="route-controller-manager" containerID="cri-o://e4e1d1efe3fc181480585fcbc679a7a49af1069362f2aebe04a886a4c40f861e" gracePeriod=30 Dec 16 12:52:59 crc kubenswrapper[4757]: I1216 12:52:59.867336 4757 generic.go:334] "Generic (PLEG): container finished" podID="18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" containerID="e4e1d1efe3fc181480585fcbc679a7a49af1069362f2aebe04a886a4c40f861e" exitCode=0 Dec 16 12:52:59 crc kubenswrapper[4757]: I1216 12:52:59.867431 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" event={"ID":"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b","Type":"ContainerDied","Data":"e4e1d1efe3fc181480585fcbc679a7a49af1069362f2aebe04a886a4c40f861e"} Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.155197 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.242667 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26csb\" (UniqueName: \"kubernetes.io/projected/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-kube-api-access-26csb\") pod \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.242718 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-serving-cert\") pod \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.242745 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-client-ca\") pod \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.242793 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-config\") pod \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\" (UID: \"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b\") " Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.243681 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" (UID: "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.243798 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-config" (OuterVolumeSpecName: "config") pod "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" (UID: "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.248394 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-kube-api-access-26csb" (OuterVolumeSpecName: "kube-api-access-26csb") pod "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" (UID: "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b"). InnerVolumeSpecName "kube-api-access-26csb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.248423 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" (UID: "18b3dafa-cb9c-443e-b2b9-0b6a43776e6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.344488 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26csb\" (UniqueName: \"kubernetes.io/projected/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-kube-api-access-26csb\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.344523 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.344532 4757 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.344542 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.873788 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" event={"ID":"18b3dafa-cb9c-443e-b2b9-0b6a43776e6b","Type":"ContainerDied","Data":"c54b74379eb356e0b6ee3b9e0eac3c876f9ac788cdd0704afa39fd6f6c4f50f8"} Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.873844 4757 scope.go:117] "RemoveContainer" containerID="e4e1d1efe3fc181480585fcbc679a7a49af1069362f2aebe04a886a4c40f861e" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.873867 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.904968 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8"] Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.908965 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77ff678cd4-szzr8"] Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.957575 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" path="/var/lib/kubelet/pods/18b3dafa-cb9c-443e-b2b9-0b6a43776e6b/volumes" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.967721 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs"] Dec 16 12:53:00 crc kubenswrapper[4757]: E1216 12:53:00.968042 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" containerName="route-controller-manager" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.968067 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" containerName="route-controller-manager" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.968199 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b3dafa-cb9c-443e-b2b9-0b6a43776e6b" containerName="route-controller-manager" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.968708 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.973075 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.973796 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.974038 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.974104 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.975427 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.976808 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 12:53:00 crc kubenswrapper[4757]: I1216 12:53:00.990352 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs"] Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.053588 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2147e7b-5606-4bb9-9c42-6e7075b471ae-serving-cert\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " 
pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.053678 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2147e7b-5606-4bb9-9c42-6e7075b471ae-config\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.054056 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf4dc\" (UniqueName: \"kubernetes.io/projected/e2147e7b-5606-4bb9-9c42-6e7075b471ae-kube-api-access-hf4dc\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.054155 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2147e7b-5606-4bb9-9c42-6e7075b471ae-client-ca\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.155644 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2147e7b-5606-4bb9-9c42-6e7075b471ae-serving-cert\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.155723 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2147e7b-5606-4bb9-9c42-6e7075b471ae-config\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.155771 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf4dc\" (UniqueName: \"kubernetes.io/projected/e2147e7b-5606-4bb9-9c42-6e7075b471ae-kube-api-access-hf4dc\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.155798 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2147e7b-5606-4bb9-9c42-6e7075b471ae-client-ca\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.156958 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2147e7b-5606-4bb9-9c42-6e7075b471ae-client-ca\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " 
pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.157174 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2147e7b-5606-4bb9-9c42-6e7075b471ae-config\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.160687 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2147e7b-5606-4bb9-9c42-6e7075b471ae-serving-cert\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.198113 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf4dc\" (UniqueName: \"kubernetes.io/projected/e2147e7b-5606-4bb9-9c42-6e7075b471ae-kube-api-access-hf4dc\") pod \"route-controller-manager-79b57c6f4c-4zqxs\" (UID: \"e2147e7b-5606-4bb9-9c42-6e7075b471ae\") " pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.282468 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.772673 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs"] Dec 16 12:53:01 crc kubenswrapper[4757]: W1216 12:53:01.777305 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2147e7b_5606_4bb9_9c42_6e7075b471ae.slice/crio-342d4c94837d1f208d672d6fe2c8f5dc43dd74323f7dc58c97448efa2eaaac4b WatchSource:0}: Error finding container 342d4c94837d1f208d672d6fe2c8f5dc43dd74323f7dc58c97448efa2eaaac4b: Status 404 returned error can't find the container with id 342d4c94837d1f208d672d6fe2c8f5dc43dd74323f7dc58c97448efa2eaaac4b Dec 16 12:53:01 crc kubenswrapper[4757]: I1216 12:53:01.882821 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" event={"ID":"e2147e7b-5606-4bb9-9c42-6e7075b471ae","Type":"ContainerStarted","Data":"342d4c94837d1f208d672d6fe2c8f5dc43dd74323f7dc58c97448efa2eaaac4b"} Dec 16 12:53:02 crc kubenswrapper[4757]: I1216 12:53:02.889485 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" event={"ID":"e2147e7b-5606-4bb9-9c42-6e7075b471ae","Type":"ContainerStarted","Data":"6d1cadf9a3d2b27960af2c3263ae6a029623cebf45b34cf5e55fd19681ab90e8"} Dec 16 12:53:02 crc kubenswrapper[4757]: I1216 12:53:02.890041 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:02 crc kubenswrapper[4757]: I1216 12:53:02.896874 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" Dec 16 12:53:02 crc kubenswrapper[4757]: I1216 12:53:02.908998 4757 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79b57c6f4c-4zqxs" podStartSLOduration=3.9089772529999998 podStartE2EDuration="3.908977253s" podCreationTimestamp="2025-12-16 12:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:02.907460991 +0000 UTC m=+368.335204787" watchObservedRunningTime="2025-12-16 12:53:02.908977253 +0000 UTC m=+368.336721059" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.615804 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4zph4"] Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.616939 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.632951 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4zph4"] Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711513 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eff411fd-db21-49c0-b877-3b85aa196b2c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711603 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-registry-tls\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711667 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eff411fd-db21-49c0-b877-3b85aa196b2c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711699 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711733 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eff411fd-db21-49c0-b877-3b85aa196b2c-trusted-ca\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711767 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/eff411fd-db21-49c0-b877-3b85aa196b2c-registry-certificates\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711806 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7bqt\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-kube-api-access-l7bqt\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.711917 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-bound-sa-token\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.736646 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813660 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-registry-tls\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813720 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eff411fd-db21-49c0-b877-3b85aa196b2c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813755 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eff411fd-db21-49c0-b877-3b85aa196b2c-trusted-ca\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813771 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7bqt\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-kube-api-access-l7bqt\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813787 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eff411fd-db21-49c0-b877-3b85aa196b2c-registry-certificates\") pod \"image-registry-66df7c8f76-4zph4\" (UID: 
\"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813810 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-bound-sa-token\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.813841 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eff411fd-db21-49c0-b877-3b85aa196b2c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.814280 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eff411fd-db21-49c0-b877-3b85aa196b2c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.815653 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eff411fd-db21-49c0-b877-3b85aa196b2c-trusted-ca\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.816808 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eff411fd-db21-49c0-b877-3b85aa196b2c-registry-certificates\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.825922 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eff411fd-db21-49c0-b877-3b85aa196b2c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.825939 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-registry-tls\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.832879 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7bqt\" (UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-kube-api-access-l7bqt\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.836287 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/eff411fd-db21-49c0-b877-3b85aa196b2c-bound-sa-token\") pod \"image-registry-66df7c8f76-4zph4\" (UID: \"eff411fd-db21-49c0-b877-3b85aa196b2c\") " pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:05 crc kubenswrapper[4757]: I1216 12:53:05.943640 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:06 crc kubenswrapper[4757]: I1216 12:53:06.421586 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4zph4"] Dec 16 12:53:06 crc kubenswrapper[4757]: I1216 12:53:06.915129 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" event={"ID":"eff411fd-db21-49c0-b877-3b85aa196b2c","Type":"ContainerStarted","Data":"de41b945956ca1dae94e81d47f31cfdd7b2e78ed56671fca94d163132cdbc613"} Dec 16 12:53:06 crc kubenswrapper[4757]: I1216 12:53:06.915383 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" event={"ID":"eff411fd-db21-49c0-b877-3b85aa196b2c","Type":"ContainerStarted","Data":"4074ccad0adc0ce544b0d7b1fafc33ca25fc043e51d500dce20f8f5e0b5fc815"} Dec 16 12:53:06 crc kubenswrapper[4757]: I1216 12:53:06.916162 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:06 crc kubenswrapper[4757]: I1216 12:53:06.940292 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" podStartSLOduration=1.9402700990000001 podStartE2EDuration="1.940270099s" podCreationTimestamp="2025-12-16 12:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:06.93610566 +0000 UTC m=+372.363849466" watchObservedRunningTime="2025-12-16 12:53:06.940270099 +0000 UTC m=+372.368013905" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.336103 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqtlh"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.336689 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqtlh" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="registry-server" containerID="cri-o://b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41" gracePeriod=30 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.342277 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7ppdm"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.342545 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7ppdm" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="registry-server" containerID="cri-o://84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097" gracePeriod=30 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.360383 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crbcx"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.360603 4757 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator" containerID="cri-o://5a774c11e5e3b64c7da7ce4bf6b50785c9ac0003db27591f60e004ac801bbc95" gracePeriod=30 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.369703 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krvn2"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.369985 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krvn2" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="registry-server" containerID="cri-o://c87f661a3dd474d83b02f52699d3835e75e1fe68468d2ba07f74f28ea2bc0af7" gracePeriod=30 Dec 16 12:53:08 crc kubenswrapper[4757]: E1216 12:53:08.375146 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 12:53:08 crc kubenswrapper[4757]: E1216 12:53:08.379251 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41 is running failed: container process not found" containerID="b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 12:53:08 crc kubenswrapper[4757]: E1216 12:53:08.381106 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41 is running failed: container process not found" containerID="b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 12:53:08 crc kubenswrapper[4757]: E1216 12:53:08.381183 4757 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-vqtlh" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="registry-server" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.389612 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smj7p"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.390209 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smj7p" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="registry-server" containerID="cri-o://ac640ed7caadf3e8677469672278695a052c560f734ea3d2cf094717a3eef02f" gracePeriod=30 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.402167 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z52vp"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.403443 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.411522 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z52vp"] Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.453201 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b40bf055-8b99-4c86-9e45-ed2253aa09a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.453481 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zpl\" (UniqueName: \"kubernetes.io/projected/b40bf055-8b99-4c86-9e45-ed2253aa09a1-kube-api-access-c6zpl\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.453672 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b40bf055-8b99-4c86-9e45-ed2253aa09a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.555509 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b40bf055-8b99-4c86-9e45-ed2253aa09a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.555548 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zpl\" (UniqueName: \"kubernetes.io/projected/b40bf055-8b99-4c86-9e45-ed2253aa09a1-kube-api-access-c6zpl\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.555569 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b40bf055-8b99-4c86-9e45-ed2253aa09a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.563343 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b40bf055-8b99-4c86-9e45-ed2253aa09a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.564803 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b40bf055-8b99-4c86-9e45-ed2253aa09a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.574775 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zpl\" (UniqueName: \"kubernetes.io/projected/b40bf055-8b99-4c86-9e45-ed2253aa09a1-kube-api-access-c6zpl\") pod \"marketplace-operator-79b997595-z52vp\" (UID: \"b40bf055-8b99-4c86-9e45-ed2253aa09a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.731698 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" Dec 16 12:53:08 crc kubenswrapper[4757]: E1216 12:53:08.833339 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ab79c2_762d_4773_ae6e_6e92acdf4508.slice/crio-conmon-84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097.scope\": RecentStats: unable to find data in memory cache]" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.855110 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.932850 4757 generic.go:334] "Generic (PLEG): container finished" podID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerID="c87f661a3dd474d83b02f52699d3835e75e1fe68468d2ba07f74f28ea2bc0af7" exitCode=0 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.932905 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krvn2" event={"ID":"17e402cb-44b0-4232-8671-b7db09c8e9b1","Type":"ContainerDied","Data":"c87f661a3dd474d83b02f52699d3835e75e1fe68468d2ba07f74f28ea2bc0af7"} Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.934152 4757 generic.go:334] "Generic (PLEG): container finished" podID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerID="b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41" exitCode=0 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.934188 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqtlh" event={"ID":"7320d121-c9e6-4af2-ad14-4db89ea38a9e","Type":"ContainerDied","Data":"b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41"} Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.936403 4757 generic.go:334] "Generic (PLEG): container finished" podID="7c067ef6-5957-4cfd-be96-788f4236d990" containerID="ac640ed7caadf3e8677469672278695a052c560f734ea3d2cf094717a3eef02f" exitCode=0 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.936445 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerDied","Data":"ac640ed7caadf3e8677469672278695a052c560f734ea3d2cf094717a3eef02f"} Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.937884 4757 generic.go:334] "Generic (PLEG): container finished" podID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerID="5a774c11e5e3b64c7da7ce4bf6b50785c9ac0003db27591f60e004ac801bbc95" exitCode=0 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.937925 4757 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" event={"ID":"5706c05b-ab36-4ed2-ac86-06146a1bddda","Type":"ContainerDied","Data":"5a774c11e5e3b64c7da7ce4bf6b50785c9ac0003db27591f60e004ac801bbc95"} Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.937946 4757 scope.go:117] "RemoveContainer" containerID="16838a1f8a7a7bcbdd967839d36df6d30ef8312ac5f3969b68c679221368b9e6" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.941442 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerID="84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097" exitCode=0 Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.941847 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ppdm" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.942452 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ppdm" event={"ID":"c8ab79c2-762d-4773-ae6e-6e92acdf4508","Type":"ContainerDied","Data":"84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097"} Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.942540 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ppdm" event={"ID":"c8ab79c2-762d-4773-ae6e-6e92acdf4508","Type":"ContainerDied","Data":"95f8b5349b38ecc911f7e9c9934ea7fb5172cc619213ace00f8a0e52952b7d06"} Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.959587 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7jn\" (UniqueName: \"kubernetes.io/projected/c8ab79c2-762d-4773-ae6e-6e92acdf4508-kube-api-access-cq7jn\") pod \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.959637 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-utilities\") pod \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.959711 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-catalog-content\") pod \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\" (UID: \"c8ab79c2-762d-4773-ae6e-6e92acdf4508\") " Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.963450 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-utilities" (OuterVolumeSpecName: "utilities") pod "c8ab79c2-762d-4773-ae6e-6e92acdf4508" (UID: "c8ab79c2-762d-4773-ae6e-6e92acdf4508"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:08 crc kubenswrapper[4757]: I1216 12:53:08.970470 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ab79c2-762d-4773-ae6e-6e92acdf4508-kube-api-access-cq7jn" (OuterVolumeSpecName: "kube-api-access-cq7jn") pod "c8ab79c2-762d-4773-ae6e-6e92acdf4508" (UID: "c8ab79c2-762d-4773-ae6e-6e92acdf4508"). InnerVolumeSpecName "kube-api-access-cq7jn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.029738 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8ab79c2-762d-4773-ae6e-6e92acdf4508" (UID: "c8ab79c2-762d-4773-ae6e-6e92acdf4508"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.059164 4757 scope.go:117] "RemoveContainer" containerID="84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.061420 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.061438 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7jn\" (UniqueName: \"kubernetes.io/projected/c8ab79c2-762d-4773-ae6e-6e92acdf4508-kube-api-access-cq7jn\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.061451 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ab79c2-762d-4773-ae6e-6e92acdf4508-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.077692 4757 scope.go:117] "RemoveContainer" containerID="adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.098896 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.117328 4757 scope.go:117] "RemoveContainer" containerID="75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.152621 4757 scope.go:117] "RemoveContainer" containerID="84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097" Dec 16 12:53:09 crc kubenswrapper[4757]: E1216 12:53:09.153220 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097\": container with ID starting with 84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097 not found: ID does not exist" containerID="84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.153297 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097"} err="failed to get container status \"84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097\": rpc error: code = NotFound desc = could not find container \"84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097\": container with ID starting with 84efb7b8f5531f8ba3ac8720a7a78e43fcb4c8b3a3c4b48062170d647cde0097 not found: ID does not exist" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.153350 4757 scope.go:117] "RemoveContainer" containerID="adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b" Dec 16 12:53:09 crc kubenswrapper[4757]: E1216 12:53:09.155519 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b\": container with ID starting with adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b not found: ID does not exist" containerID="adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.155550 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b"} err="failed to get container status \"adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b\": rpc error: code = NotFound desc = could not find container \"adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b\": container with ID starting with adc7ebc29e7848aac8c71c8162e81915ca4e568fc87a1bad8ceae8bc5a8f4e5b not found: ID does not exist" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.155569 4757 scope.go:117] "RemoveContainer" containerID="75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6" Dec 16 12:53:09 crc kubenswrapper[4757]: E1216 12:53:09.157096 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6\": container with ID starting with 75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6 not found: ID does not exist" containerID="75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.157130 4757 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6"} err="failed to get container status \"75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6\": rpc error: code = NotFound desc = could not find container \"75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6\": container with ID starting with 75489ef8f8e1fcdd8e0568d9e53afc31c17552e0d65055379fbc048ef47347b6 not found: ID does not exist" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.165830 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smj7p" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.176798 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqtlh" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.177908 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.263525 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-utilities\") pod \"17e402cb-44b0-4232-8671-b7db09c8e9b1\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.263845 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-catalog-content\") pod \"17e402cb-44b0-4232-8671-b7db09c8e9b1\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.263976 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx48j\" (UniqueName: \"kubernetes.io/projected/7c067ef6-5957-4cfd-be96-788f4236d990-kube-api-access-xx48j\") pod \"7c067ef6-5957-4cfd-be96-788f4236d990\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.264110 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-catalog-content\") pod \"7c067ef6-5957-4cfd-be96-788f4236d990\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.264253 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lngbs\" (UniqueName: \"kubernetes.io/projected/17e402cb-44b0-4232-8671-b7db09c8e9b1-kube-api-access-lngbs\") pod \"17e402cb-44b0-4232-8671-b7db09c8e9b1\" (UID: \"17e402cb-44b0-4232-8671-b7db09c8e9b1\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.264390 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-utilities\") pod \"7c067ef6-5957-4cfd-be96-788f4236d990\" (UID: \"7c067ef6-5957-4cfd-be96-788f4236d990\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.266283 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-utilities" (OuterVolumeSpecName: "utilities") pod "7c067ef6-5957-4cfd-be96-788f4236d990" (UID: "7c067ef6-5957-4cfd-be96-788f4236d990"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.274955 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e402cb-44b0-4232-8671-b7db09c8e9b1-kube-api-access-lngbs" (OuterVolumeSpecName: "kube-api-access-lngbs") pod "17e402cb-44b0-4232-8671-b7db09c8e9b1" (UID: "17e402cb-44b0-4232-8671-b7db09c8e9b1"). InnerVolumeSpecName "kube-api-access-lngbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.275076 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-utilities" (OuterVolumeSpecName: "utilities") pod "17e402cb-44b0-4232-8671-b7db09c8e9b1" (UID: "17e402cb-44b0-4232-8671-b7db09c8e9b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.276410 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7ppdm"] Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.279448 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c067ef6-5957-4cfd-be96-788f4236d990-kube-api-access-xx48j" (OuterVolumeSpecName: "kube-api-access-xx48j") pod "7c067ef6-5957-4cfd-be96-788f4236d990" (UID: "7c067ef6-5957-4cfd-be96-788f4236d990"). InnerVolumeSpecName "kube-api-access-xx48j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.280482 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7ppdm"] Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.315211 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17e402cb-44b0-4232-8671-b7db09c8e9b1" (UID: "17e402cb-44b0-4232-8671-b7db09c8e9b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.365981 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-catalog-content\") pod \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366043 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-operator-metrics\") pod \"5706c05b-ab36-4ed2-ac86-06146a1bddda\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366287 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-trusted-ca\") pod \"5706c05b-ab36-4ed2-ac86-06146a1bddda\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366381 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-utilities\") pod \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366449 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjt2g\" (UniqueName: \"kubernetes.io/projected/7320d121-c9e6-4af2-ad14-4db89ea38a9e-kube-api-access-pjt2g\") pod \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\" (UID: \"7320d121-c9e6-4af2-ad14-4db89ea38a9e\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366482 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj52f\" (UniqueName: \"kubernetes.io/projected/5706c05b-ab36-4ed2-ac86-06146a1bddda-kube-api-access-hj52f\") pod \"5706c05b-ab36-4ed2-ac86-06146a1bddda\" (UID: \"5706c05b-ab36-4ed2-ac86-06146a1bddda\") " Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366714 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366728 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e402cb-44b0-4232-8671-b7db09c8e9b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366740 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx48j\" (UniqueName: \"kubernetes.io/projected/7c067ef6-5957-4cfd-be96-788f4236d990-kube-api-access-xx48j\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366751 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lngbs\" (UniqueName: \"kubernetes.io/projected/17e402cb-44b0-4232-8671-b7db09c8e9b1-kube-api-access-lngbs\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.366785 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.368322 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-utilities" (OuterVolumeSpecName: "utilities") pod "7320d121-c9e6-4af2-ad14-4db89ea38a9e" (UID: "7320d121-c9e6-4af2-ad14-4db89ea38a9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.368363 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5706c05b-ab36-4ed2-ac86-06146a1bddda" (UID: "5706c05b-ab36-4ed2-ac86-06146a1bddda"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.402335 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7320d121-c9e6-4af2-ad14-4db89ea38a9e-kube-api-access-pjt2g" (OuterVolumeSpecName: "kube-api-access-pjt2g") pod "7320d121-c9e6-4af2-ad14-4db89ea38a9e" (UID: "7320d121-c9e6-4af2-ad14-4db89ea38a9e"). InnerVolumeSpecName "kube-api-access-pjt2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.405467 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5706c05b-ab36-4ed2-ac86-06146a1bddda-kube-api-access-hj52f" (OuterVolumeSpecName: "kube-api-access-hj52f") pod "5706c05b-ab36-4ed2-ac86-06146a1bddda" (UID: "5706c05b-ab36-4ed2-ac86-06146a1bddda"). InnerVolumeSpecName "kube-api-access-hj52f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.408722 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5706c05b-ab36-4ed2-ac86-06146a1bddda" (UID: "5706c05b-ab36-4ed2-ac86-06146a1bddda"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.433361 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c067ef6-5957-4cfd-be96-788f4236d990" (UID: "7c067ef6-5957-4cfd-be96-788f4236d990"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.434354 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7320d121-c9e6-4af2-ad14-4db89ea38a9e" (UID: "7320d121-c9e6-4af2-ad14-4db89ea38a9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.461856 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z52vp"] Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.469694 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c067ef6-5957-4cfd-be96-788f4236d990-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.470019 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.470033 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjt2g\" (UniqueName: \"kubernetes.io/projected/7320d121-c9e6-4af2-ad14-4db89ea38a9e-kube-api-access-pjt2g\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.470048 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj52f\" (UniqueName: \"kubernetes.io/projected/5706c05b-ab36-4ed2-ac86-06146a1bddda-kube-api-access-hj52f\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.470058 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7320d121-c9e6-4af2-ad14-4db89ea38a9e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.470070 4757 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.470082 4757 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5706c05b-ab36-4ed2-ac86-06146a1bddda-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.949148 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krvn2" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.949164 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krvn2" event={"ID":"17e402cb-44b0-4232-8671-b7db09c8e9b1","Type":"ContainerDied","Data":"cd66641f2e9daf099a6f7366b5d73db4cd88703d3b706c58783053f94e00516a"} Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.949208 4757 scope.go:117] "RemoveContainer" containerID="c87f661a3dd474d83b02f52699d3835e75e1fe68468d2ba07f74f28ea2bc0af7" Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.952263 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqtlh" event={"ID":"7320d121-c9e6-4af2-ad14-4db89ea38a9e","Type":"ContainerDied","Data":"b97732ad2a152909e27c0054434d7e3dfdc2f2baf7869704a7bc644c39b762e2"} Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.952285 4757 util.go:48] "No ready sandbox for pod can be found. 
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.952285 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqtlh"
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.957280 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smj7p" event={"ID":"7c067ef6-5957-4cfd-be96-788f4236d990","Type":"ContainerDied","Data":"1ecbc6fc8f3a2a1b2921688dc9750a6254807f3fab2d5361e9187a1a724db3fb"}
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.957385 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smj7p"
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.973549 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx"
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.974065 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crbcx" event={"ID":"5706c05b-ab36-4ed2-ac86-06146a1bddda","Type":"ContainerDied","Data":"b24892c6e32775b26858c732932f8ed21cc532c661b333be8495d5d22ba6987a"}
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.979842 4757 scope.go:117] "RemoveContainer" containerID="3513080f792e82d2d4326182c95754a9b4946c172f9c0cf5cc1e58d2db5985b6"
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.983794 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" event={"ID":"b40bf055-8b99-4c86-9e45-ed2253aa09a1","Type":"ContainerStarted","Data":"fc417b94acd41a9e2a782d66ccdf49930ce169866191fe77bae0f6dc40035ba6"}
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.983862 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" event={"ID":"b40bf055-8b99-4c86-9e45-ed2253aa09a1","Type":"ContainerStarted","Data":"a962be93f4d40928a18365b10e9e4d0832627a769633bd0cb33877a40fc79e5c"}
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.984801 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp"
Dec 16 12:53:09 crc kubenswrapper[4757]: I1216 12:53:09.987969 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.012200 4757 scope.go:117] "RemoveContainer" containerID="a03eb83fed81ef006a1fffecffaa596621fdb3fa92365db48fac2746ba852e87"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.023342 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" podStartSLOduration=2.02332589 podStartE2EDuration="2.02332589s" podCreationTimestamp="2025-12-16 12:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:10.005272345 +0000 UTC m=+375.433016151" watchObservedRunningTime="2025-12-16 12:53:10.02332589 +0000 UTC m=+375.451069686"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.025398 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krvn2"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.027941 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krvn2"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.041365 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smj7p"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.045503 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smj7p"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.056753 4757 scope.go:117] "RemoveContainer" containerID="b686f8f19ee15982c905ef93dcb240a55e0e5ad12dc8f291031646e7cefdee41"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.080548 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqtlh"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.084439 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqtlh"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.086130 4757 scope.go:117] "RemoveContainer" containerID="f8b4f86cc95a865e67e1e135bd86d1a0ee82445fb640122f00c7a758d8e79522"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.103487 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crbcx"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.105395 4757 scope.go:117] "RemoveContainer" containerID="5fbdffd3c3bd81970eb94f3fa08ee4c7c65bd5c2d16f9d1d86384bdeeebbce73"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.112542 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crbcx"]
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.125925 4757 scope.go:117] "RemoveContainer" containerID="ac640ed7caadf3e8677469672278695a052c560f734ea3d2cf094717a3eef02f"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.150048 4757 scope.go:117] "RemoveContainer" containerID="3981ce847ea94a6d6649c922a8032fa4852cf2b6710288337a3f4cc057204370"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.170992 4757 scope.go:117] "RemoveContainer" containerID="c869c9b88fe0ad11355187532c5b80d581a666eaeb3f70aba6bf0d417fe96e5c"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.188237 4757 scope.go:117] "RemoveContainer" containerID="5a774c11e5e3b64c7da7ce4bf6b50785c9ac0003db27591f60e004ac801bbc95"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.955460 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" path="/var/lib/kubelet/pods/17e402cb-44b0-4232-8671-b7db09c8e9b1/volumes"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.956505 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" path="/var/lib/kubelet/pods/5706c05b-ab36-4ed2-ac86-06146a1bddda/volumes"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.956928 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" path="/var/lib/kubelet/pods/7320d121-c9e6-4af2-ad14-4db89ea38a9e/volumes"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.958323 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" path="/var/lib/kubelet/pods/7c067ef6-5957-4cfd-be96-788f4236d990/volumes"
Dec 16 12:53:10 crc kubenswrapper[4757]: I1216 12:53:10.958946 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" path="/var/lib/kubelet/pods/c8ab79c2-762d-4773-ae6e-6e92acdf4508/volumes"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.902112 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tw282"]
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903239 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903277 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903289 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903296 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903313 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903321 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903360 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903369 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903387 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903394 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903403 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903415 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903448 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903459 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903468 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903475 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903492 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903500 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="extract-utilities"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903536 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903544 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903555 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903563 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903576 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903583 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="extract-content"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.903623 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903632 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.903981 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.904032 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c067ef6-5957-4cfd-be96-788f4236d990" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.904055 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.904067 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ab79c2-762d-4773-ae6e-6e92acdf4508" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.904086 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7320d121-c9e6-4af2-ad14-4db89ea38a9e" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.904131 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e402cb-44b0-4232-8671-b7db09c8e9b1" containerName="registry-server"
Dec 16 12:53:18 crc kubenswrapper[4757]: E1216 12:53:18.904501 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator"
Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.904516 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5706c05b-ab36-4ed2-ac86-06146a1bddda" containerName="marketplace-operator"
pods=["openshift-marketplace/redhat-marketplace-tw282"] Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.906579 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.917356 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.997265 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-utilities\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.997348 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-catalog-content\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:18 crc kubenswrapper[4757]: I1216 12:53:18.997382 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm59w\" (UniqueName: \"kubernetes.io/projected/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-kube-api-access-jm59w\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.085278 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cdz7w"] Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.086168 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.088659 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.097991 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdz7w"] Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.098055 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-utilities\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.098213 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-catalog-content\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.098253 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm59w\" (UniqueName: \"kubernetes.io/projected/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-kube-api-access-jm59w\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.098671 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-utilities\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.098955 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-catalog-content\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.122066 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm59w\" (UniqueName: \"kubernetes.io/projected/52fab5a3-5b60-47e2-a517-37c8d9adc3c1-kube-api-access-jm59w\") pod \"redhat-marketplace-tw282\" (UID: \"52fab5a3-5b60-47e2-a517-37c8d9adc3c1\") " pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.200210 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-utilities\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.200288 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-catalog-content\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " 
pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.200342 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lrs\" (UniqueName: \"kubernetes.io/projected/e7eba443-b255-4c4b-8aad-fc891c2a8a39-kube-api-access-d8lrs\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.301273 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-utilities\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.301401 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-catalog-content\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.301475 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lrs\" (UniqueName: \"kubernetes.io/projected/e7eba443-b255-4c4b-8aad-fc891c2a8a39-kube-api-access-d8lrs\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.301881 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-utilities\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.301917 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-catalog-content\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.320637 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lrs\" (UniqueName: \"kubernetes.io/projected/e7eba443-b255-4c4b-8aad-fc891c2a8a39-kube-api-access-d8lrs\") pod \"redhat-operators-cdz7w\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.706871 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.716542 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.754993 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm"] Dec 16 12:53:19 crc kubenswrapper[4757]: I1216 12:53:19.755239 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" podUID="ec969be3-3091-446a-9638-024ed57e190d" containerName="controller-manager" containerID="cri-o://f01579e1381f8df4c0824c3778636e27132f1f181531cbc67f2ff8e71d696cfc" gracePeriod=30 Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.041053 4757 generic.go:334] "Generic (PLEG): container finished" podID="ec969be3-3091-446a-9638-024ed57e190d" containerID="f01579e1381f8df4c0824c3778636e27132f1f181531cbc67f2ff8e71d696cfc" exitCode=0 Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.041129 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" event={"ID":"ec969be3-3091-446a-9638-024ed57e190d","Type":"ContainerDied","Data":"f01579e1381f8df4c0824c3778636e27132f1f181531cbc67f2ff8e71d696cfc"} Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.208388 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.232151 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tw282"] Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.313387 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-proxy-ca-bundles\") pod \"ec969be3-3091-446a-9638-024ed57e190d\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.313848 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csvpc\" (UniqueName: \"kubernetes.io/projected/ec969be3-3091-446a-9638-024ed57e190d-kube-api-access-csvpc\") pod \"ec969be3-3091-446a-9638-024ed57e190d\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.313889 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-client-ca\") pod \"ec969be3-3091-446a-9638-024ed57e190d\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.313905 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-config\") pod \"ec969be3-3091-446a-9638-024ed57e190d\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.313946 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec969be3-3091-446a-9638-024ed57e190d-serving-cert\") pod \"ec969be3-3091-446a-9638-024ed57e190d\" (UID: \"ec969be3-3091-446a-9638-024ed57e190d\") " Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.314637 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec969be3-3091-446a-9638-024ed57e190d" (UID: "ec969be3-3091-446a-9638-024ed57e190d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.314896 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ec969be3-3091-446a-9638-024ed57e190d" (UID: "ec969be3-3091-446a-9638-024ed57e190d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.314931 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-config" (OuterVolumeSpecName: "config") pod "ec969be3-3091-446a-9638-024ed57e190d" (UID: "ec969be3-3091-446a-9638-024ed57e190d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.320222 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec969be3-3091-446a-9638-024ed57e190d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec969be3-3091-446a-9638-024ed57e190d" (UID: "ec969be3-3091-446a-9638-024ed57e190d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.321146 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec969be3-3091-446a-9638-024ed57e190d-kube-api-access-csvpc" (OuterVolumeSpecName: "kube-api-access-csvpc") pod "ec969be3-3091-446a-9638-024ed57e190d" (UID: "ec969be3-3091-446a-9638-024ed57e190d"). InnerVolumeSpecName "kube-api-access-csvpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.331917 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdz7w"] Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.415722 4757 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec969be3-3091-446a-9638-024ed57e190d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.415751 4757 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.415780 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csvpc\" (UniqueName: \"kubernetes.io/projected/ec969be3-3091-446a-9638-024ed57e190d-kube-api-access-csvpc\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.415792 4757 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.415802 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec969be3-3091-446a-9638-024ed57e190d-config\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:20 crc kubenswrapper[4757]: I1216 12:53:20.985934 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-99bc8f764-xj7tq"] Dec 16 12:53:21 crc kubenswrapper[4757]: E1216 12:53:21.014713 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec969be3-3091-446a-9638-024ed57e190d" containerName="controller-manager" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.014752 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec969be3-3091-446a-9638-024ed57e190d" containerName="controller-manager" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.016208 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec969be3-3091-446a-9638-024ed57e190d" containerName="controller-manager" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.016784 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-99bc8f764-xj7tq"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.016943 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.047379 4757 generic.go:334] "Generic (PLEG): container finished" podID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerID="80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4" exitCode=0 Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.048191 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerDied","Data":"80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4"} Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.048315 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerStarted","Data":"7837af01ad225ffbad5445991bc829c67dc80fe5513fd218ee281784e0d431da"} Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.054255 4757 generic.go:334] "Generic (PLEG): container finished" podID="52fab5a3-5b60-47e2-a517-37c8d9adc3c1" containerID="b6051ad74b5c8f99cd335ab39995d1371b36d681f339a702b950a57fd07149c4" exitCode=0 Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.054519 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tw282" event={"ID":"52fab5a3-5b60-47e2-a517-37c8d9adc3c1","Type":"ContainerDied","Data":"b6051ad74b5c8f99cd335ab39995d1371b36d681f339a702b950a57fd07149c4"} Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.055507 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tw282" event={"ID":"52fab5a3-5b60-47e2-a517-37c8d9adc3c1","Type":"ContainerStarted","Data":"0d434e2cc805e50aeaac6165f1b71f00065c96be1daa81ad41803f51a92c5a8a"} Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.057811 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" event={"ID":"ec969be3-3091-446a-9638-024ed57e190d","Type":"ContainerDied","Data":"b0b00e35ade3521d52fad9aec645ac6477a5d71c810a248a2a4e3725a09b4c3c"} Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.057988 4757 scope.go:117] "RemoveContainer" containerID="f01579e1381f8df4c0824c3778636e27132f1f181531cbc67f2ff8e71d696cfc" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.057962 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.116044 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.119481 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6fdb9d5c89-sbbrm"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.123251 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-client-ca\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.123314 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-config\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.123446 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkz6n\" (UniqueName: \"kubernetes.io/projected/4dd8b508-f2e1-47f8-96d4-26e8a3389098-kube-api-access-nkz6n\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.123568 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd8b508-f2e1-47f8-96d4-26e8a3389098-serving-cert\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.123616 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-proxy-ca-bundles\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.181412 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.181466 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.181507 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.182017 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"caa3a93ad3bd3927512be6975f6d9bbe16d0438123c8248da65c133894f8be8b"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.182082 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://caa3a93ad3bd3927512be6975f6d9bbe16d0438123c8248da65c133894f8be8b" gracePeriod=600 Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.225646 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-client-ca\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.226352 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-config\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.226378 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkz6n\" (UniqueName: \"kubernetes.io/projected/4dd8b508-f2e1-47f8-96d4-26e8a3389098-kube-api-access-nkz6n\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.226411 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd8b508-f2e1-47f8-96d4-26e8a3389098-serving-cert\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.226445 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-proxy-ca-bundles\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.227413 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-proxy-ca-bundles\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.227629 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-config\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.228163 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dd8b508-f2e1-47f8-96d4-26e8a3389098-client-ca\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.235870 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dd8b508-f2e1-47f8-96d4-26e8a3389098-serving-cert\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.249578 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkz6n\" (UniqueName: \"kubernetes.io/projected/4dd8b508-f2e1-47f8-96d4-26e8a3389098-kube-api-access-nkz6n\") pod \"controller-manager-99bc8f764-xj7tq\" (UID: \"4dd8b508-f2e1-47f8-96d4-26e8a3389098\") " pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.287135 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9fpvh"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.288349 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.293785 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.298460 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fpvh"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.326981 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mw4z\" (UniqueName: \"kubernetes.io/projected/87336683-d0ae-4df9-91a8-881fa54e49b9-kube-api-access-8mw4z\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.327061 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87336683-d0ae-4df9-91a8-881fa54e49b9-catalog-content\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.327084 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87336683-d0ae-4df9-91a8-881fa54e49b9-utilities\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.341388 4757 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.427838 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mw4z\" (UniqueName: \"kubernetes.io/projected/87336683-d0ae-4df9-91a8-881fa54e49b9-kube-api-access-8mw4z\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.427955 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87336683-d0ae-4df9-91a8-881fa54e49b9-catalog-content\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.427981 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87336683-d0ae-4df9-91a8-881fa54e49b9-utilities\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.429330 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87336683-d0ae-4df9-91a8-881fa54e49b9-utilities\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.430478 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87336683-d0ae-4df9-91a8-881fa54e49b9-catalog-content\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.459086 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mw4z\" (UniqueName: \"kubernetes.io/projected/87336683-d0ae-4df9-91a8-881fa54e49b9-kube-api-access-8mw4z\") pod \"community-operators-9fpvh\" (UID: \"87336683-d0ae-4df9-91a8-881fa54e49b9\") " pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.487518 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnn8x"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.494874 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.505245 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.509999 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnn8x"] Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.630417 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-catalog-content\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.630479 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmn2g\" (UniqueName: \"kubernetes.io/projected/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-kube-api-access-xmn2g\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.630603 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-utilities\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.635109 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.635109 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fpvh"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.734730 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-utilities\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.735131 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-catalog-content\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.735172 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmn2g\" (UniqueName: \"kubernetes.io/projected/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-kube-api-access-xmn2g\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.736543 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-utilities\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.741619 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-catalog-content\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.758239 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmn2g\" (UniqueName: \"kubernetes.io/projected/91c99d1e-682b-4fc3-a4de-594f16bfb4d7-kube-api-access-xmn2g\") pod \"certified-operators-vnn8x\" (UID: \"91c99d1e-682b-4fc3-a4de-594f16bfb4d7\") " pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.815271 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-99bc8f764-xj7tq"]
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.821951 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnn8x"
Dec 16 12:53:21 crc kubenswrapper[4757]: I1216 12:53:21.927390 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fpvh"]
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.075290 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" event={"ID":"4dd8b508-f2e1-47f8-96d4-26e8a3389098","Type":"ContainerStarted","Data":"eabe6f35b0610284c5764af811d5ac58ab72116324e21a5ed296793bdbbe527c"}
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.077875 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="caa3a93ad3bd3927512be6975f6d9bbe16d0438123c8248da65c133894f8be8b" exitCode=0
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.077946 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"caa3a93ad3bd3927512be6975f6d9bbe16d0438123c8248da65c133894f8be8b"}
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.077972 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"06f2518ad142487e4376906bd25ad7deb58807a073968183d666cf3c0c45d958"}
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.077991 4757 scope.go:117] "RemoveContainer" containerID="2f40d89307e9a28e0c5bc75210516666e835f447d8240b6dc39fca9f0a2f99d3"
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.082793 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerStarted","Data":"81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2"}
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.084475 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fpvh" event={"ID":"87336683-d0ae-4df9-91a8-881fa54e49b9","Type":"ContainerStarted","Data":"bc7b0b6de11d607ebf268ab7ef07590d4e4096ee6b5216f60fafd61e1842490d"}
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.277947 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnn8x"]
Dec 16 12:53:22 crc kubenswrapper[4757]: I1216 12:53:22.958930 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec969be3-3091-446a-9638-024ed57e190d" path="/var/lib/kubelet/pods/ec969be3-3091-446a-9638-024ed57e190d/volumes"
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.091653 4757 generic.go:334] "Generic (PLEG): container finished" podID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerID="81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2" exitCode=0
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.092916 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerDied","Data":"81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2"}
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.101227 4757 generic.go:334] "Generic (PLEG): container finished" podID="91c99d1e-682b-4fc3-a4de-594f16bfb4d7" containerID="b22d7106ba99a6622d024fbe43759dea9185d9ba8bfce484e02e072c99753409" exitCode=0
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.102018 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnn8x" event={"ID":"91c99d1e-682b-4fc3-a4de-594f16bfb4d7","Type":"ContainerDied","Data":"b22d7106ba99a6622d024fbe43759dea9185d9ba8bfce484e02e072c99753409"}
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.102049 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnn8x" event={"ID":"91c99d1e-682b-4fc3-a4de-594f16bfb4d7","Type":"ContainerStarted","Data":"41a272018f1d538a88b13f79888f9972c7b396270835fbbc3feab0c0fa2176ce"}
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.108111 4757 generic.go:334] "Generic (PLEG): container finished" podID="87336683-d0ae-4df9-91a8-881fa54e49b9" containerID="b8349c80ec695fc72e489cd4a4921f578360218a6befa1b991b8cf920983a6fa" exitCode=0
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.108195 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fpvh" event={"ID":"87336683-d0ae-4df9-91a8-881fa54e49b9","Type":"ContainerDied","Data":"b8349c80ec695fc72e489cd4a4921f578360218a6befa1b991b8cf920983a6fa"}
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.120904 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" event={"ID":"4dd8b508-f2e1-47f8-96d4-26e8a3389098","Type":"ContainerStarted","Data":"d1c875476eaec0b7b6a2f1223660287bff6e21addd006f68690b39362b58335e"}
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.121661 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq"
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.125684 4757 generic.go:334] "Generic (PLEG): container finished" podID="52fab5a3-5b60-47e2-a517-37c8d9adc3c1" containerID="69d140c889c0a0611c6c76141b34a02c4f7b1fa8bf12eb59249ad9eea81cfed3" exitCode=0
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.125776 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tw282" event={"ID":"52fab5a3-5b60-47e2-a517-37c8d9adc3c1","Type":"ContainerDied","Data":"69d140c889c0a0611c6c76141b34a02c4f7b1fa8bf12eb59249ad9eea81cfed3"}
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.128287 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq"
Dec 16 12:53:23 crc kubenswrapper[4757]: I1216 12:53:23.184229 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-99bc8f764-xj7tq" podStartSLOduration=4.18420765 podStartE2EDuration="4.18420765s" podCreationTimestamp="2025-12-16 12:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:53:23.176092428 +0000 UTC m=+388.603836234" watchObservedRunningTime="2025-12-16 12:53:23.18420765 +0000 UTC m=+388.611951446"
Dec 16 12:53:24 crc kubenswrapper[4757]: I1216 12:53:24.159408 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerStarted","Data":"a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e"}
Dec 16
12:53:24 crc kubenswrapper[4757]: I1216 12:53:24.161835 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnn8x" event={"ID":"91c99d1e-682b-4fc3-a4de-594f16bfb4d7","Type":"ContainerStarted","Data":"2d666b775059ba9cd19bd5ceb86cd3afe3066d886453eab41949fcc0d57085fa"} Dec 16 12:53:24 crc kubenswrapper[4757]: I1216 12:53:24.164781 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fpvh" event={"ID":"87336683-d0ae-4df9-91a8-881fa54e49b9","Type":"ContainerStarted","Data":"740632dcb277e2e5da10b905b8902d72763786907dec291d3586ad6cf7db2ceb"} Dec 16 12:53:24 crc kubenswrapper[4757]: I1216 12:53:24.170794 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tw282" event={"ID":"52fab5a3-5b60-47e2-a517-37c8d9adc3c1","Type":"ContainerStarted","Data":"3241466091d0fcc4851bfcafbaed049f1820669bcfb873cc67808a9bb19d0f58"} Dec 16 12:53:24 crc kubenswrapper[4757]: I1216 12:53:24.179642 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cdz7w" podStartSLOduration=2.548367155 podStartE2EDuration="5.179623629s" podCreationTimestamp="2025-12-16 12:53:19 +0000 UTC" firstStartedPulling="2025-12-16 12:53:21.0501819 +0000 UTC m=+386.477925696" lastFinishedPulling="2025-12-16 12:53:23.681438374 +0000 UTC m=+389.109182170" observedRunningTime="2025-12-16 12:53:24.178431894 +0000 UTC m=+389.606175690" watchObservedRunningTime="2025-12-16 12:53:24.179623629 +0000 UTC m=+389.607367425" Dec 16 12:53:24 crc kubenswrapper[4757]: I1216 12:53:24.241725 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tw282" podStartSLOduration=3.608139589 podStartE2EDuration="6.241710303s" podCreationTimestamp="2025-12-16 12:53:18 +0000 UTC" firstStartedPulling="2025-12-16 12:53:21.055657447 +0000 UTC m=+386.483401243" lastFinishedPulling="2025-12-16 12:53:23.689228161 +0000 UTC m=+389.116971957" observedRunningTime="2025-12-16 12:53:24.240083287 +0000 UTC m=+389.667827093" watchObservedRunningTime="2025-12-16 12:53:24.241710303 +0000 UTC m=+389.669454099" Dec 16 12:53:25 crc kubenswrapper[4757]: I1216 12:53:25.177374 4757 generic.go:334] "Generic (PLEG): container finished" podID="91c99d1e-682b-4fc3-a4de-594f16bfb4d7" containerID="2d666b775059ba9cd19bd5ceb86cd3afe3066d886453eab41949fcc0d57085fa" exitCode=0 Dec 16 12:53:25 crc kubenswrapper[4757]: I1216 12:53:25.177463 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnn8x" event={"ID":"91c99d1e-682b-4fc3-a4de-594f16bfb4d7","Type":"ContainerDied","Data":"2d666b775059ba9cd19bd5ceb86cd3afe3066d886453eab41949fcc0d57085fa"} Dec 16 12:53:25 crc kubenswrapper[4757]: I1216 12:53:25.180770 4757 generic.go:334] "Generic (PLEG): container finished" podID="87336683-d0ae-4df9-91a8-881fa54e49b9" containerID="740632dcb277e2e5da10b905b8902d72763786907dec291d3586ad6cf7db2ceb" exitCode=0 Dec 16 12:53:25 crc kubenswrapper[4757]: I1216 12:53:25.180925 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fpvh" event={"ID":"87336683-d0ae-4df9-91a8-881fa54e49b9","Type":"ContainerDied","Data":"740632dcb277e2e5da10b905b8902d72763786907dec291d3586ad6cf7db2ceb"} Dec 16 12:53:25 crc kubenswrapper[4757]: I1216 12:53:25.948880 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-4zph4" Dec 16 12:53:26 crc kubenswrapper[4757]: I1216 12:53:26.015598 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ws9qr"] Dec 16 12:53:26 crc kubenswrapper[4757]: I1216 12:53:26.188505 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fpvh" event={"ID":"87336683-d0ae-4df9-91a8-881fa54e49b9","Type":"ContainerStarted","Data":"2e778dc740402855c9ead7106abee1e5788505f5e4d24e687fdca6e391ae11a9"} Dec 16 12:53:26 crc kubenswrapper[4757]: I1216 12:53:26.191688 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnn8x" event={"ID":"91c99d1e-682b-4fc3-a4de-594f16bfb4d7","Type":"ContainerStarted","Data":"70cdbf10b61388fca69a257dea9103791ac46be943071ccfeeb3053a0c83c22b"} Dec 16 12:53:26 crc kubenswrapper[4757]: I1216 12:53:26.208740 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fpvh" podStartSLOduration=2.504898113 podStartE2EDuration="5.208720834s" podCreationTimestamp="2025-12-16 12:53:21 +0000 UTC" firstStartedPulling="2025-12-16 12:53:23.111088632 +0000 UTC m=+388.538832428" lastFinishedPulling="2025-12-16 12:53:25.814911353 +0000 UTC m=+391.242655149" observedRunningTime="2025-12-16 12:53:26.206237221 +0000 UTC m=+391.633981037" watchObservedRunningTime="2025-12-16 12:53:26.208720834 +0000 UTC m=+391.636464630" Dec 16 12:53:26 crc kubenswrapper[4757]: I1216 12:53:26.228625 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vnn8x" podStartSLOduration=2.56243133 podStartE2EDuration="5.228605128s" podCreationTimestamp="2025-12-16 12:53:21 +0000 UTC" firstStartedPulling="2025-12-16 12:53:23.104088323 +0000 UTC m=+388.531832119" lastFinishedPulling="2025-12-16 12:53:25.770262121 +0000 UTC m=+391.198005917" observedRunningTime="2025-12-16 12:53:26.226352789 +0000 UTC m=+391.654096585" watchObservedRunningTime="2025-12-16 12:53:26.228605128 +0000 UTC m=+391.656348924" Dec 16 12:53:29 crc kubenswrapper[4757]: I1216 12:53:29.708945 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:29 crc kubenswrapper[4757]: I1216 12:53:29.710346 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:29 crc kubenswrapper[4757]: I1216 12:53:29.717222 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:29 crc kubenswrapper[4757]: I1216 12:53:29.717278 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:29 crc kubenswrapper[4757]: I1216 12:53:29.759965 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:29 crc kubenswrapper[4757]: I1216 12:53:29.762237 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:30 crc kubenswrapper[4757]: I1216 12:53:30.257580 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tw282" Dec 16 12:53:30 crc kubenswrapper[4757]: I1216 12:53:30.265415 4757 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 12:53:31 crc kubenswrapper[4757]: I1216 12:53:31.636261 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:31 crc kubenswrapper[4757]: I1216 12:53:31.636628 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:31 crc kubenswrapper[4757]: I1216 12:53:31.687793 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:31 crc kubenswrapper[4757]: I1216 12:53:31.822663 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:31 crc kubenswrapper[4757]: I1216 12:53:31.823497 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:31 crc kubenswrapper[4757]: I1216 12:53:31.869393 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:32 crc kubenswrapper[4757]: I1216 12:53:32.272251 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vnn8x" Dec 16 12:53:32 crc kubenswrapper[4757]: I1216 12:53:32.285700 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fpvh" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.078356 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" podUID="7e7b566f-4c89-4834-ba16-f5e5286eda7e" containerName="registry" containerID="cri-o://0e686faaf5ea9d1d25d805eddab869b5f0eef7afb7b4bfc0d41d639b92da26c5" gracePeriod=30 Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.334036 4757 generic.go:334] "Generic (PLEG): container finished" podID="7e7b566f-4c89-4834-ba16-f5e5286eda7e" containerID="0e686faaf5ea9d1d25d805eddab869b5f0eef7afb7b4bfc0d41d639b92da26c5" exitCode=0 Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.334217 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" event={"ID":"7e7b566f-4c89-4834-ba16-f5e5286eda7e","Type":"ContainerDied","Data":"0e686faaf5ea9d1d25d805eddab869b5f0eef7afb7b4bfc0d41d639b92da26c5"} Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.500281 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.661792 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-certificates\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.661848 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e7b566f-4c89-4834-ba16-f5e5286eda7e-installation-pull-secrets\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.662113 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.662148 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-trusted-ca\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.662212 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e7b566f-4c89-4834-ba16-f5e5286eda7e-ca-trust-extracted\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.662230 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-bound-sa-token\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.662247 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4bkw\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-kube-api-access-h4bkw\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.662265 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-tls\") pod \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\" (UID: \"7e7b566f-4c89-4834-ba16-f5e5286eda7e\") " Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.663259 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.663915 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.685411 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e7b566f-4c89-4834-ba16-f5e5286eda7e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.687909 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.696299 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7b566f-4c89-4834-ba16-f5e5286eda7e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.696427 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.696877 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-kube-api-access-h4bkw" (OuterVolumeSpecName: "kube-api-access-h4bkw") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "kube-api-access-h4bkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.697309 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7e7b566f-4c89-4834-ba16-f5e5286eda7e" (UID: "7e7b566f-4c89-4834-ba16-f5e5286eda7e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764114 4757 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764177 4757 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e7b566f-4c89-4834-ba16-f5e5286eda7e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764190 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4bkw\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-kube-api-access-h4bkw\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764207 4757 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764219 4757 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764231 4757 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e7b566f-4c89-4834-ba16-f5e5286eda7e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:51 crc kubenswrapper[4757]: I1216 12:53:51.764242 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e7b566f-4c89-4834-ba16-f5e5286eda7e-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 12:53:52 crc kubenswrapper[4757]: I1216 12:53:52.340770 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" event={"ID":"7e7b566f-4c89-4834-ba16-f5e5286eda7e","Type":"ContainerDied","Data":"908b405d01bbe842bb85ecfde842a2484e2dad2c3f6729feb88eb45f09f46ed9"} Dec 16 12:53:52 crc kubenswrapper[4757]: I1216 12:53:52.341177 4757 scope.go:117] "RemoveContainer" containerID="0e686faaf5ea9d1d25d805eddab869b5f0eef7afb7b4bfc0d41d639b92da26c5" Dec 16 12:53:52 crc kubenswrapper[4757]: I1216 12:53:52.341324 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ws9qr" Dec 16 12:53:52 crc kubenswrapper[4757]: I1216 12:53:52.377361 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ws9qr"] Dec 16 12:53:52 crc kubenswrapper[4757]: I1216 12:53:52.381308 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ws9qr"] Dec 16 12:53:52 crc kubenswrapper[4757]: I1216 12:53:52.956960 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7b566f-4c89-4834-ba16-f5e5286eda7e" path="/var/lib/kubelet/pods/7e7b566f-4c89-4834-ba16-f5e5286eda7e/volumes" Dec 16 12:55:21 crc kubenswrapper[4757]: I1216 12:55:21.181627 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:55:21 crc kubenswrapper[4757]: I1216 12:55:21.182629 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:55:51 crc kubenswrapper[4757]: I1216 12:55:51.181608 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:55:51 crc kubenswrapper[4757]: I1216 12:55:51.182800 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:56:21 crc kubenswrapper[4757]: I1216 12:56:21.181362 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:56:21 crc kubenswrapper[4757]: I1216 12:56:21.182835 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:56:21 crc kubenswrapper[4757]: I1216 12:56:21.182948 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:56:21 crc kubenswrapper[4757]: I1216 12:56:21.184129 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06f2518ad142487e4376906bd25ad7deb58807a073968183d666cf3c0c45d958"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 16 12:56:21 crc kubenswrapper[4757]: I1216 12:56:21.184289 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://06f2518ad142487e4376906bd25ad7deb58807a073968183d666cf3c0c45d958" gracePeriod=600 Dec 16 12:56:22 crc kubenswrapper[4757]: I1216 12:56:22.163484 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="06f2518ad142487e4376906bd25ad7deb58807a073968183d666cf3c0c45d958" exitCode=0 Dec 16 12:56:22 crc kubenswrapper[4757]: I1216 12:56:22.163570 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"06f2518ad142487e4376906bd25ad7deb58807a073968183d666cf3c0c45d958"} Dec 16 12:56:22 crc kubenswrapper[4757]: I1216 12:56:22.164194 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"8a775c60c16076b0ed545742e1f91801e3b31e7877ff7d29827c20b473cfd673"} Dec 16 12:56:22 crc kubenswrapper[4757]: I1216 12:56:22.164223 4757 scope.go:117] "RemoveContainer" containerID="caa3a93ad3bd3927512be6975f6d9bbe16d0438123c8248da65c133894f8be8b" Dec 16 12:56:55 crc kubenswrapper[4757]: I1216 12:56:55.110386 4757 scope.go:117] "RemoveContainer" containerID="fad05903fc52ace047ff7f12cb505827333a3ff7e7aff802b3c3b9f98b8cc4f6" Dec 16 12:58:21 crc kubenswrapper[4757]: I1216 12:58:21.181748 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:58:21 crc kubenswrapper[4757]: I1216 12:58:21.182348 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.835116 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-82cdt"] Dec 16 12:58:30 crc kubenswrapper[4757]: E1216 12:58:30.835916 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7b566f-4c89-4834-ba16-f5e5286eda7e" containerName="registry" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.835935 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7b566f-4c89-4834-ba16-f5e5286eda7e" containerName="registry" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.836088 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7b566f-4c89-4834-ba16-f5e5286eda7e" containerName="registry" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.836576 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.841163 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.841292 4757 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nzjjm" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.842408 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.851472 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dd8lt"] Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.852350 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-dd8lt" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.855217 4757 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pvd9n" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.860487 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-82cdt"] Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.869329 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cgxqx"] Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.870268 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.871796 4757 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8xlwh" Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.879909 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dd8lt"] Dec 16 12:58:30 crc kubenswrapper[4757]: I1216 12:58:30.891116 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cgxqx"] Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.011597 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjmr\" (UniqueName: \"kubernetes.io/projected/1a132cd7-a7ae-476a-ad05-9a2ec1981349-kube-api-access-lfjmr\") pod \"cert-manager-cainjector-7f985d654d-82cdt\" (UID: \"1a132cd7-a7ae-476a-ad05-9a2ec1981349\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.011679 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjbxx\" (UniqueName: \"kubernetes.io/projected/fc655451-6b29-42c9-836d-ae8ae9a5d77b-kube-api-access-hjbxx\") pod \"cert-manager-5b446d88c5-dd8lt\" (UID: \"fc655451-6b29-42c9-836d-ae8ae9a5d77b\") " pod="cert-manager/cert-manager-5b446d88c5-dd8lt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.011756 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwv7k\" (UniqueName: \"kubernetes.io/projected/04fe3c89-14a7-4830-b290-538d3ae20a12-kube-api-access-jwv7k\") pod \"cert-manager-webhook-5655c58dd6-cgxqx\" (UID: \"04fe3c89-14a7-4830-b290-538d3ae20a12\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:31 
crc kubenswrapper[4757]: I1216 12:58:31.112644 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwv7k\" (UniqueName: \"kubernetes.io/projected/04fe3c89-14a7-4830-b290-538d3ae20a12-kube-api-access-jwv7k\") pod \"cert-manager-webhook-5655c58dd6-cgxqx\" (UID: \"04fe3c89-14a7-4830-b290-538d3ae20a12\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.112993 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjmr\" (UniqueName: \"kubernetes.io/projected/1a132cd7-a7ae-476a-ad05-9a2ec1981349-kube-api-access-lfjmr\") pod \"cert-manager-cainjector-7f985d654d-82cdt\" (UID: \"1a132cd7-a7ae-476a-ad05-9a2ec1981349\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.113152 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjbxx\" (UniqueName: \"kubernetes.io/projected/fc655451-6b29-42c9-836d-ae8ae9a5d77b-kube-api-access-hjbxx\") pod \"cert-manager-5b446d88c5-dd8lt\" (UID: \"fc655451-6b29-42c9-836d-ae8ae9a5d77b\") " pod="cert-manager/cert-manager-5b446d88c5-dd8lt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.133102 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjbxx\" (UniqueName: \"kubernetes.io/projected/fc655451-6b29-42c9-836d-ae8ae9a5d77b-kube-api-access-hjbxx\") pod \"cert-manager-5b446d88c5-dd8lt\" (UID: \"fc655451-6b29-42c9-836d-ae8ae9a5d77b\") " pod="cert-manager/cert-manager-5b446d88c5-dd8lt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.134801 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjmr\" (UniqueName: \"kubernetes.io/projected/1a132cd7-a7ae-476a-ad05-9a2ec1981349-kube-api-access-lfjmr\") pod \"cert-manager-cainjector-7f985d654d-82cdt\" (UID: \"1a132cd7-a7ae-476a-ad05-9a2ec1981349\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.138369 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwv7k\" (UniqueName: \"kubernetes.io/projected/04fe3c89-14a7-4830-b290-538d3ae20a12-kube-api-access-jwv7k\") pod \"cert-manager-webhook-5655c58dd6-cgxqx\" (UID: \"04fe3c89-14a7-4830-b290-538d3ae20a12\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.154663 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.170357 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-dd8lt" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.184569 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.400847 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-dd8lt"] Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.416640 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 12:58:31 crc kubenswrapper[4757]: W1216 12:58:31.454181 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a132cd7_a7ae_476a_ad05_9a2ec1981349.slice/crio-1e355598e7e750f8532dbef7dd5434c829bb06257cbd3e57432c3d40b01ca7dd WatchSource:0}: Error finding container 1e355598e7e750f8532dbef7dd5434c829bb06257cbd3e57432c3d40b01ca7dd: Status 404 returned error can't find the container with id 1e355598e7e750f8532dbef7dd5434c829bb06257cbd3e57432c3d40b01ca7dd Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.454627 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-82cdt"] Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.739618 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cgxqx"] Dec 16 12:58:31 crc kubenswrapper[4757]: W1216 12:58:31.747078 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04fe3c89_14a7_4830_b290_538d3ae20a12.slice/crio-4f33cfa08312c523320d96cf2b3a42aa8e9a719716c861e4c3dda1c001992865 WatchSource:0}: Error finding container 4f33cfa08312c523320d96cf2b3a42aa8e9a719716c861e4c3dda1c001992865: Status 404 returned error can't find the container with id 4f33cfa08312c523320d96cf2b3a42aa8e9a719716c861e4c3dda1c001992865 Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.936237 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-dd8lt" event={"ID":"fc655451-6b29-42c9-836d-ae8ae9a5d77b","Type":"ContainerStarted","Data":"2b902223d7d47856d74a38129fa481f63350797caac0762bf702ff538f95e2be"} Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.937535 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" event={"ID":"04fe3c89-14a7-4830-b290-538d3ae20a12","Type":"ContainerStarted","Data":"4f33cfa08312c523320d96cf2b3a42aa8e9a719716c861e4c3dda1c001992865"} Dec 16 12:58:31 crc kubenswrapper[4757]: I1216 12:58:31.938554 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" event={"ID":"1a132cd7-a7ae-476a-ad05-9a2ec1981349","Type":"ContainerStarted","Data":"1e355598e7e750f8532dbef7dd5434c829bb06257cbd3e57432c3d40b01ca7dd"} Dec 16 12:58:34 crc kubenswrapper[4757]: I1216 12:58:34.956438 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" event={"ID":"1a132cd7-a7ae-476a-ad05-9a2ec1981349","Type":"ContainerStarted","Data":"3827d281a6adc85be4b12a037332495b3f0afcf5d23994b4add825147607ad81"} Dec 16 12:58:34 crc kubenswrapper[4757]: I1216 12:58:34.958699 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-dd8lt" event={"ID":"fc655451-6b29-42c9-836d-ae8ae9a5d77b","Type":"ContainerStarted","Data":"c818ec55256719825071ca1e9479033011bed453c20dcd0a1dfcb797884938c0"} Dec 16 12:58:34 crc kubenswrapper[4757]: I1216 12:58:34.964565 4757 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" event={"ID":"04fe3c89-14a7-4830-b290-538d3ae20a12","Type":"ContainerStarted","Data":"1e1da61dffee459f4e961e507301be545d43e7748d9aec68aedad9c4b2b8ab3f"} Dec 16 12:58:34 crc kubenswrapper[4757]: I1216 12:58:34.964931 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:35 crc kubenswrapper[4757]: I1216 12:58:35.038360 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-82cdt" podStartSLOduration=1.873576746 podStartE2EDuration="5.038341682s" podCreationTimestamp="2025-12-16 12:58:30 +0000 UTC" firstStartedPulling="2025-12-16 12:58:31.465649046 +0000 UTC m=+696.893392842" lastFinishedPulling="2025-12-16 12:58:34.630413982 +0000 UTC m=+700.058157778" observedRunningTime="2025-12-16 12:58:35.024803107 +0000 UTC m=+700.452546903" watchObservedRunningTime="2025-12-16 12:58:35.038341682 +0000 UTC m=+700.466085478" Dec 16 12:58:35 crc kubenswrapper[4757]: I1216 12:58:35.040154 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-dd8lt" podStartSLOduration=1.832737307 podStartE2EDuration="5.040146049s" podCreationTimestamp="2025-12-16 12:58:30 +0000 UTC" firstStartedPulling="2025-12-16 12:58:31.416085299 +0000 UTC m=+696.843829095" lastFinishedPulling="2025-12-16 12:58:34.623494041 +0000 UTC m=+700.051237837" observedRunningTime="2025-12-16 12:58:35.039331587 +0000 UTC m=+700.467075383" watchObservedRunningTime="2025-12-16 12:58:35.040146049 +0000 UTC m=+700.467889855" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.187096 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.202873 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" podStartSLOduration=8.253056345 podStartE2EDuration="11.202855128s" podCreationTimestamp="2025-12-16 12:58:30 +0000 UTC" firstStartedPulling="2025-12-16 12:58:31.749434209 +0000 UTC m=+697.177178005" lastFinishedPulling="2025-12-16 12:58:34.699232992 +0000 UTC m=+700.126976788" observedRunningTime="2025-12-16 12:58:35.064877115 +0000 UTC m=+700.492620911" watchObservedRunningTime="2025-12-16 12:58:41.202855128 +0000 UTC m=+706.630598924" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521357 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t465t"] Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521731 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-controller" containerID="cri-o://8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521749 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521842 4757 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-acl-logging" containerID="cri-o://0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521831 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="northd" containerID="cri-o://7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521878 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="sbdb" containerID="cri-o://89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521937 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-node" containerID="cri-o://e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.521763 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="nbdb" containerID="cri-o://ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.574214 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" containerID="cri-o://c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209" gracePeriod=30 Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.793240 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/3.log" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.795426 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovn-acl-logging/0.log" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.795929 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovn-controller/0.log" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.796879 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.850500 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fb992"] Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.851107 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.851175 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.851229 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.851319 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.851377 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="nbdb" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.851429 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="nbdb" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.851488 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.851536 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.851595 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kubecfg-setup" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852134 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kubecfg-setup" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.852198 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="sbdb" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852260 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="sbdb" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.852319 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852370 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.852423 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852467 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.852517 4757 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-acl-logging" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852569 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-acl-logging" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.852620 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-node" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852665 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-node" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.852719 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="northd" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852771 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="northd" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852938 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.852999 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853068 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-acl-logging" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853268 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="sbdb" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853336 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="northd" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853403 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="kube-rbac-proxy-node" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853465 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="nbdb" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853516 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853569 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853617 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853661 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovn-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.853826 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller" Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 
12:58:41.853882 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller"
Dec 16 12:58:41 crc kubenswrapper[4757]: E1216 12:58:41.853934 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.853987 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.854183 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerName="ovnkube-controller"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.856109 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952579 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-env-overrides\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952625 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b876e35b-75f8-407e-bf25-f7b3c2024428-ovn-node-metrics-cert\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952644 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-bin\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952675 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-script-lib\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952705 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-ovn\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952733 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-systemd\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952757 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-etc-openvswitch\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952777 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-slash\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952811 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-config\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952830 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-ovn-kubernetes\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952848 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-log-socket\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952868 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-kubelet\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952894 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c58k\" (UniqueName: \"kubernetes.io/projected/b876e35b-75f8-407e-bf25-f7b3c2024428-kube-api-access-9c58k\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952920 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-var-lib-openvswitch\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952939 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-node-log\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952972 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-systemd-units\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952992 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-netns\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953027 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953048 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-openvswitch\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953066 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-netd\") pod \"b876e35b-75f8-407e-bf25-f7b3c2024428\" (UID: \"b876e35b-75f8-407e-bf25-f7b3c2024428\") "
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953201 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-cni-bin\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953230 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-ovnkube-script-lib\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953262 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-systemd\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953285 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-ovn\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953308 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-kubelet\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953329 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953352 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-var-lib-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953376 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-ovnkube-config\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953399 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m5mm\" (UniqueName: \"kubernetes.io/projected/02e14920-9085-42c3-a5af-77a610e2985f-kube-api-access-4m5mm\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953422 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-etc-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953450 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-env-overrides\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953483 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953513 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-run-netns\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953534 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-systemd-units\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953556 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-slash\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953575 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-log-socket\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953594 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-cni-netd\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953616 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02e14920-9085-42c3-a5af-77a610e2985f-ovn-node-metrics-cert\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953640 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-node-log\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953666 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-run-ovn-kubernetes\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952890 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952934 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.952968 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953291 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953305 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953324 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-node-log" (OuterVolumeSpecName: "node-log") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953335 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-log-socket" (OuterVolumeSpecName: "log-socket") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953344 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953732 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953353 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-slash" (OuterVolumeSpecName: "host-slash") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953648 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953673 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953763 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953783 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953805 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953823 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.953859 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.957975 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b876e35b-75f8-407e-bf25-f7b3c2024428-kube-api-access-9c58k" (OuterVolumeSpecName: "kube-api-access-9c58k") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "kube-api-access-9c58k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.958136 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b876e35b-75f8-407e-bf25-f7b3c2024428-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 12:58:41 crc kubenswrapper[4757]: I1216 12:58:41.965304 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b876e35b-75f8-407e-bf25-f7b3c2024428" (UID: "b876e35b-75f8-407e-bf25-f7b3c2024428"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.005176 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovnkube-controller/3.log"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.007556 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovn-acl-logging/0.log"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008255 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t465t_b876e35b-75f8-407e-bf25-f7b3c2024428/ovn-controller/0.log"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008706 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209" exitCode=0
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008739 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36" exitCode=0
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008749 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946" exitCode=0
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008758 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a" exitCode=0
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008767 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302" exitCode=0
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008776 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874" exitCode=0
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008784 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" exitCode=143
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008793 4757 generic.go:334] "Generic (PLEG): container finished" podID="b876e35b-75f8-407e-bf25-f7b3c2024428" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" exitCode=143
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008792 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008832 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008849 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008856 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t465t"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008869 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008882 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008894 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008906 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008921 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008928 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008934 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008941 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008947 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008953 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008959 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008965 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008975 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008986 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008989 4757 scope.go:117] "RemoveContainer" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.008996 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009111 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009122 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009130 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009137 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009145 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009153 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009160 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009167 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009178 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009193 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009200 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009207 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009218 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009224 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009231 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009237 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009244 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009251 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009257 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009266 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t465t" event={"ID":"b876e35b-75f8-407e-bf25-f7b3c2024428","Type":"ContainerDied","Data":"f431c0d2658ec360c04077cc43e3ea314cd54336b9aab2385e3d68efaec17c91"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009276 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009284 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009291 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009298 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009304 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009311 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009321 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009328 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009335 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.009341 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.010654 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/2.log"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.014296 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/1.log"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.014582 4757 generic.go:334] "Generic (PLEG): container finished" podID="395610a4-58ca-497e-93a6-714bd6c111c1" containerID="04480320c664741eda6338e8db28abff320565456f1d04941da70ec65707aa77" exitCode=2
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.014701 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerDied","Data":"04480320c664741eda6338e8db28abff320565456f1d04941da70ec65707aa77"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.014783 4757 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3"}
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.015341 4757 scope.go:117] "RemoveContainer" containerID="04480320c664741eda6338e8db28abff320565456f1d04941da70ec65707aa77"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.015614 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-cz9q7_openshift-multus(395610a4-58ca-497e-93a6-714bd6c111c1)\"" pod="openshift-multus/multus-cz9q7" podUID="395610a4-58ca-497e-93a6-714bd6c111c1"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.023124 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.040704 4757 scope.go:117] "RemoveContainer" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054714 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t465t"]
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054762 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-systemd\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054714 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-systemd\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054822 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-ovn\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054847 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054873 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-kubelet\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054895 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-var-lib-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054898 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-ovn\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054916 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-ovnkube-config\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054939 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m5mm\" (UniqueName: \"kubernetes.io/projected/02e14920-9085-42c3-a5af-77a610e2985f-kube-api-access-4m5mm\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054944 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054967 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-etc-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054976 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-var-lib-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054997 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-env-overrides\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055040 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-etc-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055052 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055083 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-run-netns\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055104 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-systemd-units\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055127 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-slash\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055147 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-log-socket\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055169 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-cni-netd\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055194 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02e14920-9085-42c3-a5af-77a610e2985f-ovn-node-metrics-cert\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055218 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-node-log\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055244 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-run-ovn-kubernetes\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055268 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-cni-bin\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055289 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-ovnkube-script-lib\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055335 4757 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055349 4757 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b876e35b-75f8-407e-bf25-f7b3c2024428-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055361 4757 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-bin\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055372 4757 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055384 4757 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055393 4757 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-systemd\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055397 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-slash\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055441 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-log-socket\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055477 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-cni-netd\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055868 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-run-ovn-kubernetes\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055918 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-node-log\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055950 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-cni-bin\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055976 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-run-openvswitch\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.054947 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-kubelet\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056268 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-ovnkube-script-lib\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.055405 4757 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056312 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-host-run-netns\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056331 4757 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-slash\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056353 4757 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-log-socket\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056372 4757 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b876e35b-75f8-407e-bf25-f7b3c2024428-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056389 4757 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056402 4757 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-kubelet\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056414 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c58k\" (UniqueName: \"kubernetes.io/projected/b876e35b-75f8-407e-bf25-f7b3c2024428-kube-api-access-9c58k\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056426 4757 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056439 4757 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-node-log\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056441 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-env-overrides\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056452 4757 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-systemd-units\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056464 4757 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-run-netns\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056477 4757 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056489 4757 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-run-openvswitch\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056503 4757 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b876e35b-75f8-407e-bf25-f7b3c2024428-host-cni-netd\") on node \"crc\" DevicePath \"\""
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.056987 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02e14920-9085-42c3-a5af-77a610e2985f-systemd-units\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.057474 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02e14920-9085-42c3-a5af-77a610e2985f-ovnkube-config\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.059670 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02e14920-9085-42c3-a5af-77a610e2985f-ovn-node-metrics-cert\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.060048 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t465t"]
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.060481 4757 scope.go:117] "RemoveContainer" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.069732 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m5mm\" (UniqueName: \"kubernetes.io/projected/02e14920-9085-42c3-a5af-77a610e2985f-kube-api-access-4m5mm\") pod \"ovnkube-node-fb992\" (UID: \"02e14920-9085-42c3-a5af-77a610e2985f\") " pod="openshift-ovn-kubernetes/ovnkube-node-fb992"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.072575 4757 scope.go:117] "RemoveContainer" containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.091459 4757 scope.go:117] "RemoveContainer" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.102773 4757 scope.go:117] "RemoveContainer" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.113766 4757 scope.go:117] "RemoveContainer" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.124195 4757 scope.go:117] "RemoveContainer" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.137091 4757 scope.go:117] "RemoveContainer" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.147711 4757 scope.go:117] "RemoveContainer" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.148242 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": container with ID starting with c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209 not found: ID does not exist" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.148275 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"} err="failed to get container status \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": rpc error: code = NotFound desc = could not find container \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": container with ID starting with c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209 not found: ID does not exist"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.148295 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.148564 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": container with ID starting with d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d not found: ID does not exist" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.148638 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"} err="failed to get container status \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": rpc error: code = NotFound desc = could not find container \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": container with ID starting with d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d not found: ID does not exist"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.148698 4757 scope.go:117] "RemoveContainer" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.148968 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": container with ID starting with 89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36 not found: ID does not exist" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.149080 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"} err="failed to get container status \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": rpc error: code = NotFound desc = could not find container \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": container with ID starting with 89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36 not found: ID does not exist"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.149146 4757 scope.go:117] "RemoveContainer" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.149529 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": container with ID starting with ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946 not found: ID does not exist" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.149597 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"} err="failed to get container status \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": rpc error: code = NotFound desc = could not find container \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": container with ID starting with ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946 not found: ID does not exist"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.149659 4757 scope.go:117] "RemoveContainer" containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.150057 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": container with ID starting with 7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a not found: ID does not exist" containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.150123 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"} err="failed to get container status \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": rpc error: code = NotFound desc = could not find container \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": container with ID starting with 7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a not found: ID does not exist"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.150189 4757 scope.go:117] "RemoveContainer" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.150538 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": container with ID starting with c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302 not found: ID does not exist" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.150604 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"} err="failed to get container status \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": rpc error: code = NotFound desc = could not find container \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": container with ID starting with c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302 not found: ID does not exist"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.150658 4757 scope.go:117] "RemoveContainer" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"
Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.150922 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": container with ID starting with e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874 not found: ID does not exist" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"
Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.150994 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"} err="failed to get container status \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": rpc error: code = NotFound desc = could not find container 
\"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": container with ID starting with e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.151080 4757 scope.go:117] "RemoveContainer" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.151360 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": container with ID starting with 0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151 not found: ID does not exist" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.151381 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"} err="failed to get container status \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": rpc error: code = NotFound desc = could not find container \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": container with ID starting with 0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.151394 4757 scope.go:117] "RemoveContainer" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.151603 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": container with ID starting with 8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154 not found: ID does not exist" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.151671 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"} err="failed to get container status \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": rpc error: code = NotFound desc = could not find container \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": container with ID starting with 8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.151728 4757 scope.go:117] "RemoveContainer" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43" Dec 16 12:58:42 crc kubenswrapper[4757]: E1216 12:58:42.151977 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": container with ID starting with a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43 not found: ID does not exist" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152071 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"} 
err="failed to get container status \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": rpc error: code = NotFound desc = could not find container \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": container with ID starting with a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152128 4757 scope.go:117] "RemoveContainer" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152381 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"} err="failed to get container status \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": rpc error: code = NotFound desc = could not find container \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": container with ID starting with c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152403 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152587 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"} err="failed to get container status \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": rpc error: code = NotFound desc = could not find container \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": container with ID starting with d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152604 4757 scope.go:117] "RemoveContainer" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152779 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"} err="failed to get container status \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": rpc error: code = NotFound desc = could not find container \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": container with ID starting with 89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.152846 4757 scope.go:117] "RemoveContainer" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.153249 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"} err="failed to get container status \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": rpc error: code = NotFound desc = could not find container \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": container with ID starting with ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.153265 4757 scope.go:117] "RemoveContainer" 
containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.153516 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"} err="failed to get container status \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": rpc error: code = NotFound desc = could not find container \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": container with ID starting with 7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.153532 4757 scope.go:117] "RemoveContainer" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.153765 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"} err="failed to get container status \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": rpc error: code = NotFound desc = could not find container \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": container with ID starting with c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.153789 4757 scope.go:117] "RemoveContainer" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154040 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"} err="failed to get container status \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": rpc error: code = NotFound desc = could not find container \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": container with ID starting with e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154084 4757 scope.go:117] "RemoveContainer" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154329 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"} err="failed to get container status \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": rpc error: code = NotFound desc = could not find container \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": container with ID starting with 0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154354 4757 scope.go:117] "RemoveContainer" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154611 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"} err="failed to get container status \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": rpc error: code = NotFound desc = could not find 
container \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": container with ID starting with 8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154627 4757 scope.go:117] "RemoveContainer" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154903 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"} err="failed to get container status \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": rpc error: code = NotFound desc = could not find container \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": container with ID starting with a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.154926 4757 scope.go:117] "RemoveContainer" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.155214 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"} err="failed to get container status \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": rpc error: code = NotFound desc = could not find container \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": container with ID starting with c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.155231 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.155494 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"} err="failed to get container status \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": rpc error: code = NotFound desc = could not find container \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": container with ID starting with d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.155519 4757 scope.go:117] "RemoveContainer" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.155748 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"} err="failed to get container status \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": rpc error: code = NotFound desc = could not find container \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": container with ID starting with 89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.155765 4757 scope.go:117] "RemoveContainer" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156023 4757 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"} err="failed to get container status \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": rpc error: code = NotFound desc = could not find container \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": container with ID starting with ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156047 4757 scope.go:117] "RemoveContainer" containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156279 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"} err="failed to get container status \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": rpc error: code = NotFound desc = could not find container \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": container with ID starting with 7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156299 4757 scope.go:117] "RemoveContainer" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156531 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"} err="failed to get container status \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": rpc error: code = NotFound desc = could not find container \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": container with ID starting with c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156555 4757 scope.go:117] "RemoveContainer" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156825 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"} err="failed to get container status \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": rpc error: code = NotFound desc = could not find container \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": container with ID starting with e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.156842 4757 scope.go:117] "RemoveContainer" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.157115 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"} err="failed to get container status \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": rpc error: code = NotFound desc = could not find container \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": container with ID starting with 
0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.157140 4757 scope.go:117] "RemoveContainer" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.157460 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"} err="failed to get container status \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": rpc error: code = NotFound desc = could not find container \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": container with ID starting with 8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.157477 4757 scope.go:117] "RemoveContainer" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.157737 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"} err="failed to get container status \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": rpc error: code = NotFound desc = could not find container \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": container with ID starting with a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.157805 4757 scope.go:117] "RemoveContainer" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.158191 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"} err="failed to get container status \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": rpc error: code = NotFound desc = could not find container \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": container with ID starting with c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.158260 4757 scope.go:117] "RemoveContainer" containerID="d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.158539 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d"} err="failed to get container status \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": rpc error: code = NotFound desc = could not find container \"d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d\": container with ID starting with d576694f18d7da47941f4bcdc2a19030952aabe6a4afd3b0a61ad938b593d18d not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.158609 4757 scope.go:117] "RemoveContainer" containerID="89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.158894 4757 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36"} err="failed to get container status \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": rpc error: code = NotFound desc = could not find container \"89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36\": container with ID starting with 89d735f53536645e4f104e450c8c30fad80c57fbff867c311b7de2b9b5161f36 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.158918 4757 scope.go:117] "RemoveContainer" containerID="ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159170 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946"} err="failed to get container status \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": rpc error: code = NotFound desc = could not find container \"ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946\": container with ID starting with ad2c3f5c3949c8a1c655dd37b70f3053738db604ad3684193741acc8bcc20946 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159192 4757 scope.go:117] "RemoveContainer" containerID="7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159402 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a"} err="failed to get container status \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": rpc error: code = NotFound desc = could not find container \"7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a\": container with ID starting with 7beeee378133d13aac9b930f202e143ec7e8d0c74f4c9ff832086864d483504a not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159422 4757 scope.go:117] "RemoveContainer" containerID="c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159646 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302"} err="failed to get container status \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": rpc error: code = NotFound desc = could not find container \"c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302\": container with ID starting with c4ca9fe23dc8ed670a74ac4792e201f7186757d159817cafc38cb9f30bf59302 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159666 4757 scope.go:117] "RemoveContainer" containerID="e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159864 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874"} err="failed to get container status \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": rpc error: code = NotFound desc = could not find container \"e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874\": container with ID starting with e102130aa1c728e5bca0a82651413eef3cc8024d1eb28e2c347c6b4a5163d874 not found: ID does not exist" Dec 
16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.159884 4757 scope.go:117] "RemoveContainer" containerID="0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.160118 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151"} err="failed to get container status \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": rpc error: code = NotFound desc = could not find container \"0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151\": container with ID starting with 0e58cae2eb8ceeda2b5316becc18c60e5f7bb5385ae6455f0143e3ddc5906151 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.160139 4757 scope.go:117] "RemoveContainer" containerID="8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.160341 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154"} err="failed to get container status \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": rpc error: code = NotFound desc = could not find container \"8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154\": container with ID starting with 8842f169c0bc4a6ee79c7a95c90b97f47331e12fe192948f7a8b4d3cf155c154 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.160362 4757 scope.go:117] "RemoveContainer" containerID="a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.160572 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43"} err="failed to get container status \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": rpc error: code = NotFound desc = could not find container \"a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43\": container with ID starting with a3a56b1e865b8eaa2eda2d6789029ffcc1ccac21fd4afabc67623581b9330a43 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.160662 4757 scope.go:117] "RemoveContainer" containerID="c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.161035 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209"} err="failed to get container status \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": rpc error: code = NotFound desc = could not find container \"c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209\": container with ID starting with c4294623df0a0a11c6343e92ea7b5a9b855e6598f738b54419a89ab1f2e8d209 not found: ID does not exist" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.171352 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:58:42 crc kubenswrapper[4757]: I1216 12:58:42.955472 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b876e35b-75f8-407e-bf25-f7b3c2024428" path="/var/lib/kubelet/pods/b876e35b-75f8-407e-bf25-f7b3c2024428/volumes" Dec 16 12:58:43 crc kubenswrapper[4757]: I1216 12:58:43.023288 4757 generic.go:334] "Generic (PLEG): container finished" podID="02e14920-9085-42c3-a5af-77a610e2985f" containerID="bec9995408c6c7980bc368fc820227f85d04fa8d166ff8f1a481ccff165accef" exitCode=0 Dec 16 12:58:43 crc kubenswrapper[4757]: I1216 12:58:43.023339 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerDied","Data":"bec9995408c6c7980bc368fc820227f85d04fa8d166ff8f1a481ccff165accef"} Dec 16 12:58:43 crc kubenswrapper[4757]: I1216 12:58:43.023382 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"0900e5f938a45b3f02e8f4f6f682410a462f49d05e3a1eac2d2cb33c3f613691"} Dec 16 12:58:44 crc kubenswrapper[4757]: I1216 12:58:44.031231 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"5eaf0fda79567f62c063ace626fae1520e120b849c2c7c3c3ac25983dcaf2e7a"} Dec 16 12:58:44 crc kubenswrapper[4757]: I1216 12:58:44.031572 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"ed9cc6f7cc93ec1776f0193970f2f9054e3fcd1e5d8e9fbf47f75fea0c1caae9"} Dec 16 12:58:44 crc kubenswrapper[4757]: I1216 12:58:44.031587 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"675dcebb4f3cef5bb1d460aeacd7810d2888707ea1393cbb1e69d09185e321bd"} Dec 16 12:58:44 crc kubenswrapper[4757]: I1216 12:58:44.031595 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"09107da3adfa70b2436402d0d787175b179bff670a8d36b252c4fa5f54c05617"} Dec 16 12:58:44 crc kubenswrapper[4757]: I1216 12:58:44.031605 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"1e2d45cbcc23aa2a40f16e9a14c12ecc97464d478e3b232552ed6b86ac6cea4d"} Dec 16 12:58:44 crc kubenswrapper[4757]: I1216 12:58:44.031613 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"ec435ac682a3f50e0fbae791f93053ac514e57579c5aec6f82fdb1ec9d5889ea"} Dec 16 12:58:46 crc kubenswrapper[4757]: I1216 12:58:46.051047 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"855c515e8fedbfc5ddc1782d083070b517a46885a9b008f866dc9b110380ba01"} Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.068815 4757 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" event={"ID":"02e14920-9085-42c3-a5af-77a610e2985f","Type":"ContainerStarted","Data":"04741e3aab613cb559a5a5f1475ffa96a6af55f3949b3c01de336c2d86af7ccd"} Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.069294 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.069314 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.069325 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.098780 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" podStartSLOduration=7.098762142 podStartE2EDuration="7.098762142s" podCreationTimestamp="2025-12-16 12:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:58:48.097240522 +0000 UTC m=+713.524984408" watchObservedRunningTime="2025-12-16 12:58:48.098762142 +0000 UTC m=+713.526505938" Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.114942 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:58:48 crc kubenswrapper[4757]: I1216 12:58:48.127670 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:58:51 crc kubenswrapper[4757]: I1216 12:58:51.181861 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:58:51 crc kubenswrapper[4757]: I1216 12:58:51.183088 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:58:52 crc kubenswrapper[4757]: I1216 12:58:52.949064 4757 scope.go:117] "RemoveContainer" containerID="04480320c664741eda6338e8db28abff320565456f1d04941da70ec65707aa77" Dec 16 12:58:52 crc kubenswrapper[4757]: E1216 12:58:52.949540 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-cz9q7_openshift-multus(395610a4-58ca-497e-93a6-714bd6c111c1)\"" pod="openshift-multus/multus-cz9q7" podUID="395610a4-58ca-497e-93a6-714bd6c111c1" Dec 16 12:58:55 crc kubenswrapper[4757]: I1216 12:58:55.162606 4757 scope.go:117] "RemoveContainer" containerID="e788f45ea79fe7471dd4c6124a4bd5c8025b259e461b847cb96dc6cd647203e3" Dec 16 12:58:56 crc kubenswrapper[4757]: I1216 12:58:56.110943 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/2.log" Dec 16 12:59:03 crc kubenswrapper[4757]: I1216 12:59:03.948837 4757 scope.go:117] "RemoveContainer" 
containerID="04480320c664741eda6338e8db28abff320565456f1d04941da70ec65707aa77" Dec 16 12:59:05 crc kubenswrapper[4757]: I1216 12:59:05.152747 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cz9q7_395610a4-58ca-497e-93a6-714bd6c111c1/kube-multus/2.log" Dec 16 12:59:05 crc kubenswrapper[4757]: I1216 12:59:05.153010 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cz9q7" event={"ID":"395610a4-58ca-497e-93a6-714bd6c111c1","Type":"ContainerStarted","Data":"ca261735ee5d334f09beac44aadaa78196f6ce14bab2a3f6aea4dae054fe2901"} Dec 16 12:59:12 crc kubenswrapper[4757]: I1216 12:59:12.194042 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fb992" Dec 16 12:59:21 crc kubenswrapper[4757]: I1216 12:59:21.180914 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 12:59:21 crc kubenswrapper[4757]: I1216 12:59:21.181519 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 12:59:21 crc kubenswrapper[4757]: I1216 12:59:21.181570 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 12:59:21 crc kubenswrapper[4757]: I1216 12:59:21.182895 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a775c60c16076b0ed545742e1f91801e3b31e7877ff7d29827c20b473cfd673"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 12:59:21 crc kubenswrapper[4757]: I1216 12:59:21.182981 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://8a775c60c16076b0ed545742e1f91801e3b31e7877ff7d29827c20b473cfd673" gracePeriod=600 Dec 16 12:59:22 crc kubenswrapper[4757]: I1216 12:59:22.260832 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="8a775c60c16076b0ed545742e1f91801e3b31e7877ff7d29827c20b473cfd673" exitCode=0 Dec 16 12:59:22 crc kubenswrapper[4757]: I1216 12:59:22.260919 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"8a775c60c16076b0ed545742e1f91801e3b31e7877ff7d29827c20b473cfd673"} Dec 16 12:59:22 crc kubenswrapper[4757]: I1216 12:59:22.261825 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"2a23f8e521631b063ae4952d912ce6130192fc2c50ebd364a75b084a90f4b969"} Dec 16 12:59:22 crc kubenswrapper[4757]: I1216 
12:59:22.261859 4757 scope.go:117] "RemoveContainer" containerID="06f2518ad142487e4376906bd25ad7deb58807a073968183d666cf3c0c45d958" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.645957 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct"] Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.647288 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.649378 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.658187 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct"] Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.777146 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29xjt\" (UniqueName: \"kubernetes.io/projected/4aeee37c-9135-4d1a-8e52-181ea394acc6-kube-api-access-29xjt\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.777222 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.777289 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.878554 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29xjt\" (UniqueName: \"kubernetes.io/projected/4aeee37c-9135-4d1a-8e52-181ea394acc6-kube-api-access-29xjt\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.878631 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.878686 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.879200 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.879320 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.897262 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29xjt\" (UniqueName: \"kubernetes.io/projected/4aeee37c-9135-4d1a-8e52-181ea394acc6-kube-api-access-29xjt\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:23 crc kubenswrapper[4757]: I1216 12:59:23.964162 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:24 crc kubenswrapper[4757]: I1216 12:59:24.168139 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct"] Dec 16 12:59:24 crc kubenswrapper[4757]: W1216 12:59:24.186249 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aeee37c_9135_4d1a_8e52_181ea394acc6.slice/crio-1fce15b14297f9e57c2c6344e5dbb5e6722849c6da34da4d837e659b062599ae WatchSource:0}: Error finding container 1fce15b14297f9e57c2c6344e5dbb5e6722849c6da34da4d837e659b062599ae: Status 404 returned error can't find the container with id 1fce15b14297f9e57c2c6344e5dbb5e6722849c6da34da4d837e659b062599ae Dec 16 12:59:24 crc kubenswrapper[4757]: I1216 12:59:24.272795 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" event={"ID":"4aeee37c-9135-4d1a-8e52-181ea394acc6","Type":"ContainerStarted","Data":"1fce15b14297f9e57c2c6344e5dbb5e6722849c6da34da4d837e659b062599ae"} Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.279565 4757 generic.go:334] "Generic (PLEG): container finished" podID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerID="11e7a77c02a672c71073a9ca0bc84238b125f795fe44c097111c6645078d5056" exitCode=0 Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.279614 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" 
event={"ID":"4aeee37c-9135-4d1a-8e52-181ea394acc6","Type":"ContainerDied","Data":"11e7a77c02a672c71073a9ca0bc84238b125f795fe44c097111c6645078d5056"} Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.497536 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phkjc"] Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.498629 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.518636 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phkjc"] Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.607748 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wck\" (UniqueName: \"kubernetes.io/projected/1f037065-1189-4342-8d2c-82093767998a-kube-api-access-g8wck\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.607792 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-catalog-content\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.607847 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-utilities\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.709054 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-utilities\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.709160 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wck\" (UniqueName: \"kubernetes.io/projected/1f037065-1189-4342-8d2c-82093767998a-kube-api-access-g8wck\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.709185 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-catalog-content\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.709863 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-utilities\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.709884 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-catalog-content\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.740231 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wck\" (UniqueName: \"kubernetes.io/projected/1f037065-1189-4342-8d2c-82093767998a-kube-api-access-g8wck\") pod \"redhat-operators-phkjc\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:25 crc kubenswrapper[4757]: I1216 12:59:25.815618 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:26 crc kubenswrapper[4757]: I1216 12:59:26.021989 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phkjc"] Dec 16 12:59:26 crc kubenswrapper[4757]: W1216 12:59:26.026416 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f037065_1189_4342_8d2c_82093767998a.slice/crio-c2df94826bc72a48cef4bdafd3fbb442a3ada5e111e1df23220ed4f233bfe93f WatchSource:0}: Error finding container c2df94826bc72a48cef4bdafd3fbb442a3ada5e111e1df23220ed4f233bfe93f: Status 404 returned error can't find the container with id c2df94826bc72a48cef4bdafd3fbb442a3ada5e111e1df23220ed4f233bfe93f Dec 16 12:59:26 crc kubenswrapper[4757]: I1216 12:59:26.287555 4757 generic.go:334] "Generic (PLEG): container finished" podID="1f037065-1189-4342-8d2c-82093767998a" containerID="a57bb43abf6cf526d34fdbd57c95c8787612aa521fe8204b16ab5001ba88ed86" exitCode=0 Dec 16 12:59:26 crc kubenswrapper[4757]: I1216 12:59:26.287597 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerDied","Data":"a57bb43abf6cf526d34fdbd57c95c8787612aa521fe8204b16ab5001ba88ed86"} Dec 16 12:59:26 crc kubenswrapper[4757]: I1216 12:59:26.287624 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerStarted","Data":"c2df94826bc72a48cef4bdafd3fbb442a3ada5e111e1df23220ed4f233bfe93f"} Dec 16 12:59:26 crc kubenswrapper[4757]: I1216 12:59:26.637496 4757 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 12:59:27 crc kubenswrapper[4757]: I1216 12:59:27.295562 4757 generic.go:334] "Generic (PLEG): container finished" podID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerID="8662e96283c8ccddfaf8c514c1ce0b638c192de1c263c6f5faea78d1d89c1f6e" exitCode=0 Dec 16 12:59:27 crc kubenswrapper[4757]: I1216 12:59:27.295654 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" event={"ID":"4aeee37c-9135-4d1a-8e52-181ea394acc6","Type":"ContainerDied","Data":"8662e96283c8ccddfaf8c514c1ce0b638c192de1c263c6f5faea78d1d89c1f6e"} Dec 16 12:59:27 crc kubenswrapper[4757]: I1216 12:59:27.297911 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" 
event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerStarted","Data":"cb00d3628afd3f305e59bed014ab093671787785793a63af4500b77e7f2a81c7"} Dec 16 12:59:28 crc kubenswrapper[4757]: I1216 12:59:28.308528 4757 generic.go:334] "Generic (PLEG): container finished" podID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerID="0a268b8b9c3bb56e7dc1ecec96b6ab82b286fd1a1424a7d56d49d1d5c9a295d2" exitCode=0 Dec 16 12:59:28 crc kubenswrapper[4757]: I1216 12:59:28.308620 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" event={"ID":"4aeee37c-9135-4d1a-8e52-181ea394acc6","Type":"ContainerDied","Data":"0a268b8b9c3bb56e7dc1ecec96b6ab82b286fd1a1424a7d56d49d1d5c9a295d2"} Dec 16 12:59:28 crc kubenswrapper[4757]: I1216 12:59:28.310914 4757 generic.go:334] "Generic (PLEG): container finished" podID="1f037065-1189-4342-8d2c-82093767998a" containerID="cb00d3628afd3f305e59bed014ab093671787785793a63af4500b77e7f2a81c7" exitCode=0 Dec 16 12:59:28 crc kubenswrapper[4757]: I1216 12:59:28.310954 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerDied","Data":"cb00d3628afd3f305e59bed014ab093671787785793a63af4500b77e7f2a81c7"} Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.316565 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerStarted","Data":"913c5a6c2fb7201e77c61a04935e31becf1ca2e03ad007183c5937e98e971538"} Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.607453 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.638134 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phkjc" podStartSLOduration=2.029879249 podStartE2EDuration="4.638113917s" podCreationTimestamp="2025-12-16 12:59:25 +0000 UTC" firstStartedPulling="2025-12-16 12:59:26.289666029 +0000 UTC m=+751.717409825" lastFinishedPulling="2025-12-16 12:59:28.897900697 +0000 UTC m=+754.325644493" observedRunningTime="2025-12-16 12:59:29.337535398 +0000 UTC m=+754.765279194" watchObservedRunningTime="2025-12-16 12:59:29.638113917 +0000 UTC m=+755.065857713" Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.657139 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-util\") pod \"4aeee37c-9135-4d1a-8e52-181ea394acc6\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.657219 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29xjt\" (UniqueName: \"kubernetes.io/projected/4aeee37c-9135-4d1a-8e52-181ea394acc6-kube-api-access-29xjt\") pod \"4aeee37c-9135-4d1a-8e52-181ea394acc6\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.657256 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-bundle\") pod \"4aeee37c-9135-4d1a-8e52-181ea394acc6\" (UID: \"4aeee37c-9135-4d1a-8e52-181ea394acc6\") " Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.658233 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-bundle" (OuterVolumeSpecName: "bundle") pod "4aeee37c-9135-4d1a-8e52-181ea394acc6" (UID: "4aeee37c-9135-4d1a-8e52-181ea394acc6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.664156 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aeee37c-9135-4d1a-8e52-181ea394acc6-kube-api-access-29xjt" (OuterVolumeSpecName: "kube-api-access-29xjt") pod "4aeee37c-9135-4d1a-8e52-181ea394acc6" (UID: "4aeee37c-9135-4d1a-8e52-181ea394acc6"). InnerVolumeSpecName "kube-api-access-29xjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.758506 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29xjt\" (UniqueName: \"kubernetes.io/projected/4aeee37c-9135-4d1a-8e52-181ea394acc6-kube-api-access-29xjt\") on node \"crc\" DevicePath \"\"" Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.758541 4757 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 12:59:29 crc kubenswrapper[4757]: I1216 12:59:29.962949 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-util" (OuterVolumeSpecName: "util") pod "4aeee37c-9135-4d1a-8e52-181ea394acc6" (UID: "4aeee37c-9135-4d1a-8e52-181ea394acc6"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:59:30 crc kubenswrapper[4757]: I1216 12:59:30.062292 4757 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aeee37c-9135-4d1a-8e52-181ea394acc6-util\") on node \"crc\" DevicePath \"\"" Dec 16 12:59:30 crc kubenswrapper[4757]: I1216 12:59:30.327671 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" event={"ID":"4aeee37c-9135-4d1a-8e52-181ea394acc6","Type":"ContainerDied","Data":"1fce15b14297f9e57c2c6344e5dbb5e6722849c6da34da4d837e659b062599ae"} Dec 16 12:59:30 crc kubenswrapper[4757]: I1216 12:59:30.327716 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fce15b14297f9e57c2c6344e5dbb5e6722849c6da34da4d837e659b062599ae" Dec 16 12:59:30 crc kubenswrapper[4757]: I1216 12:59:30.327751 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.235082 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-crfw5"] Dec 16 12:59:35 crc kubenswrapper[4757]: E1216 12:59:35.235875 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="pull" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.235892 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="pull" Dec 16 12:59:35 crc kubenswrapper[4757]: E1216 12:59:35.235917 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="extract" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.235925 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="extract" Dec 16 12:59:35 crc kubenswrapper[4757]: E1216 12:59:35.235942 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="util" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.235950 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="util" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.236087 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aeee37c-9135-4d1a-8e52-181ea394acc6" containerName="extract" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.236544 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.242907 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.242946 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.243056 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gfqmv" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.254552 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-crfw5"] Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.325123 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw9q2\" (UniqueName: \"kubernetes.io/projected/13ed71f2-85e0-4dc6-94c1-82e12982c67f-kube-api-access-bw9q2\") pod \"nmstate-operator-6769fb99d-crfw5\" (UID: \"13ed71f2-85e0-4dc6-94c1-82e12982c67f\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.426482 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw9q2\" (UniqueName: \"kubernetes.io/projected/13ed71f2-85e0-4dc6-94c1-82e12982c67f-kube-api-access-bw9q2\") pod \"nmstate-operator-6769fb99d-crfw5\" (UID: \"13ed71f2-85e0-4dc6-94c1-82e12982c67f\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.453810 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw9q2\" (UniqueName: \"kubernetes.io/projected/13ed71f2-85e0-4dc6-94c1-82e12982c67f-kube-api-access-bw9q2\") pod \"nmstate-operator-6769fb99d-crfw5\" (UID: \"13ed71f2-85e0-4dc6-94c1-82e12982c67f\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.555676 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.816352 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.816693 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.846430 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-crfw5"] Dec 16 12:59:35 crc kubenswrapper[4757]: I1216 12:59:35.872943 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:36 crc kubenswrapper[4757]: I1216 12:59:36.358461 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" event={"ID":"13ed71f2-85e0-4dc6-94c1-82e12982c67f","Type":"ContainerStarted","Data":"0b053e9ea8ec482b259e38db94bc075824f66aaa20e51f243b7b31d9d956765c"} Dec 16 12:59:36 crc kubenswrapper[4757]: I1216 12:59:36.404562 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:38 crc kubenswrapper[4757]: I1216 12:59:38.368744 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phkjc"] Dec 16 12:59:38 crc kubenswrapper[4757]: I1216 12:59:38.373180 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phkjc" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="registry-server" containerID="cri-o://913c5a6c2fb7201e77c61a04935e31becf1ca2e03ad007183c5937e98e971538" gracePeriod=2 Dec 16 12:59:40 crc kubenswrapper[4757]: I1216 12:59:40.384824 4757 generic.go:334] "Generic (PLEG): container finished" podID="1f037065-1189-4342-8d2c-82093767998a" containerID="913c5a6c2fb7201e77c61a04935e31becf1ca2e03ad007183c5937e98e971538" exitCode=0 Dec 16 12:59:40 crc kubenswrapper[4757]: I1216 12:59:40.384859 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerDied","Data":"913c5a6c2fb7201e77c61a04935e31becf1ca2e03ad007183c5937e98e971538"} Dec 16 12:59:40 crc kubenswrapper[4757]: I1216 12:59:40.386716 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" event={"ID":"13ed71f2-85e0-4dc6-94c1-82e12982c67f","Type":"ContainerStarted","Data":"67078a80ab7edb9808bba75fe785b94fd82f57b8a562f5efecee6a8044d28cd9"} Dec 16 12:59:40 crc kubenswrapper[4757]: I1216 12:59:40.410773 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-crfw5" podStartSLOduration=2.601442757 podStartE2EDuration="5.410753149s" podCreationTimestamp="2025-12-16 12:59:35 +0000 UTC" firstStartedPulling="2025-12-16 12:59:35.867051784 +0000 UTC m=+761.294795580" lastFinishedPulling="2025-12-16 12:59:38.676362176 +0000 UTC m=+764.104105972" observedRunningTime="2025-12-16 12:59:40.409464226 +0000 UTC m=+765.837208032" watchObservedRunningTime="2025-12-16 12:59:40.410753149 +0000 UTC m=+765.838496945" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.022609 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.076698 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-catalog-content\") pod \"1f037065-1189-4342-8d2c-82093767998a\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.083645 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-utilities\") pod \"1f037065-1189-4342-8d2c-82093767998a\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.083698 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8wck\" (UniqueName: \"kubernetes.io/projected/1f037065-1189-4342-8d2c-82093767998a-kube-api-access-g8wck\") pod \"1f037065-1189-4342-8d2c-82093767998a\" (UID: \"1f037065-1189-4342-8d2c-82093767998a\") " Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.084758 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-utilities" (OuterVolumeSpecName: "utilities") pod "1f037065-1189-4342-8d2c-82093767998a" (UID: "1f037065-1189-4342-8d2c-82093767998a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.099286 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f037065-1189-4342-8d2c-82093767998a-kube-api-access-g8wck" (OuterVolumeSpecName: "kube-api-access-g8wck") pod "1f037065-1189-4342-8d2c-82093767998a" (UID: "1f037065-1189-4342-8d2c-82093767998a"). InnerVolumeSpecName "kube-api-access-g8wck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.184868 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.184927 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8wck\" (UniqueName: \"kubernetes.io/projected/1f037065-1189-4342-8d2c-82093767998a-kube-api-access-g8wck\") on node \"crc\" DevicePath \"\"" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.211128 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f037065-1189-4342-8d2c-82093767998a" (UID: "1f037065-1189-4342-8d2c-82093767998a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.285941 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f037065-1189-4342-8d2c-82093767998a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.395698 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phkjc" event={"ID":"1f037065-1189-4342-8d2c-82093767998a","Type":"ContainerDied","Data":"c2df94826bc72a48cef4bdafd3fbb442a3ada5e111e1df23220ed4f233bfe93f"} Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.395781 4757 scope.go:117] "RemoveContainer" containerID="913c5a6c2fb7201e77c61a04935e31becf1ca2e03ad007183c5937e98e971538" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.395723 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phkjc" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.415465 4757 scope.go:117] "RemoveContainer" containerID="cb00d3628afd3f305e59bed014ab093671787785793a63af4500b77e7f2a81c7" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.426281 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phkjc"] Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.436494 4757 scope.go:117] "RemoveContainer" containerID="a57bb43abf6cf526d34fdbd57c95c8787612aa521fe8204b16ab5001ba88ed86" Dec 16 12:59:41 crc kubenswrapper[4757]: I1216 12:59:41.436813 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phkjc"] Dec 16 12:59:42 crc kubenswrapper[4757]: I1216 12:59:42.955491 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f037065-1189-4342-8d2c-82093767998a" path="/var/lib/kubelet/pods/1f037065-1189-4342-8d2c-82093767998a/volumes" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.293478 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf"] Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.293997 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="extract-utilities" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.294028 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="extract-utilities" Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.294039 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="registry-server" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.294046 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="registry-server" Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.294061 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="extract-content" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.294068 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="extract-content" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.294179 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f037065-1189-4342-8d2c-82093767998a" containerName="registry-server" Dec 16 12:59:44 
crc kubenswrapper[4757]: I1216 12:59:44.294755 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.302434 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ckdzq" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.316033 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.318734 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4mr\" (UniqueName: \"kubernetes.io/projected/e6a65757-6cab-4b58-a399-d186414d6485-kube-api-access-vr4mr\") pod \"nmstate-metrics-7f7f7578db-gj6pf\" (UID: \"e6a65757-6cab-4b58-a399-d186414d6485\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.320547 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.321335 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.334973 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.343443 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.348408 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8dspc"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.349212 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420280 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-nmstate-lock\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420368 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjz5d\" (UniqueName: \"kubernetes.io/projected/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-kube-api-access-sjz5d\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420413 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4mr\" (UniqueName: \"kubernetes.io/projected/e6a65757-6cab-4b58-a399-d186414d6485-kube-api-access-vr4mr\") pod \"nmstate-metrics-7f7f7578db-gj6pf\" (UID: \"e6a65757-6cab-4b58-a399-d186414d6485\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420443 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420479 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jk8\" (UniqueName: \"kubernetes.io/projected/7a0c9c35-f5d3-4503-831d-840fdc460911-kube-api-access-42jk8\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420509 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-ovs-socket\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.420585 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-dbus-socket\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.456765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4mr\" (UniqueName: \"kubernetes.io/projected/e6a65757-6cab-4b58-a399-d186414d6485-kube-api-access-vr4mr\") pod \"nmstate-metrics-7f7f7578db-gj6pf\" (UID: \"e6a65757-6cab-4b58-a399-d186414d6485\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.464586 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc"] Dec 16 12:59:44 
crc kubenswrapper[4757]: I1216 12:59:44.465271 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.468849 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.468926 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xvpb6" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.469279 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.478590 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.521844 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/536f7375-828d-41f1-afd3-509271873ae2-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.521959 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522145 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42jk8\" (UniqueName: \"kubernetes.io/projected/7a0c9c35-f5d3-4503-831d-840fdc460911-kube-api-access-42jk8\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522169 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llhn7\" (UniqueName: \"kubernetes.io/projected/536f7375-828d-41f1-afd3-509271873ae2-kube-api-access-llhn7\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522238 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-ovs-socket\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522268 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-dbus-socket\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522312 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/536f7375-828d-41f1-afd3-509271873ae2-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522334 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-nmstate-lock\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522376 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjz5d\" (UniqueName: \"kubernetes.io/projected/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-kube-api-access-sjz5d\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.522558 4757 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.522632 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-tls-key-pair podName:97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922 nodeName:}" failed. No retries permitted until 2025-12-16 12:59:45.022607853 +0000 UTC m=+770.450351649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-tls-key-pair") pod "nmstate-webhook-f8fb84555-l6pg2" (UID: "97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922") : secret "openshift-nmstate-webhook" not found Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522737 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-ovs-socket\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522958 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-dbus-socket\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.522995 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a0c9c35-f5d3-4503-831d-840fdc460911-nmstate-lock\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.542738 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jk8\" (UniqueName: \"kubernetes.io/projected/7a0c9c35-f5d3-4503-831d-840fdc460911-kube-api-access-42jk8\") pod \"nmstate-handler-8dspc\" (UID: \"7a0c9c35-f5d3-4503-831d-840fdc460911\") " pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.545680 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjz5d\" (UniqueName: 
\"kubernetes.io/projected/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-kube-api-access-sjz5d\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.615665 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.623348 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/536f7375-828d-41f1-afd3-509271873ae2-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.623448 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llhn7\" (UniqueName: \"kubernetes.io/projected/536f7375-828d-41f1-afd3-509271873ae2-kube-api-access-llhn7\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.623504 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/536f7375-828d-41f1-afd3-509271873ae2-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.623540 4757 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 16 12:59:44 crc kubenswrapper[4757]: E1216 12:59:44.623600 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/536f7375-828d-41f1-afd3-509271873ae2-plugin-serving-cert podName:536f7375-828d-41f1-afd3-509271873ae2 nodeName:}" failed. No retries permitted until 2025-12-16 12:59:45.123580756 +0000 UTC m=+770.551324562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/536f7375-828d-41f1-afd3-509271873ae2-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-b9shc" (UID: "536f7375-828d-41f1-afd3-509271873ae2") : secret "plugin-serving-cert" not found Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.624456 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/536f7375-828d-41f1-afd3-509271873ae2-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.642498 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llhn7\" (UniqueName: \"kubernetes.io/projected/536f7375-828d-41f1-afd3-509271873ae2-kube-api-access-llhn7\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.665603 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.686873 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86779b9dcc-zs7b9"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.687648 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.699230 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86779b9dcc-zs7b9"] Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724214 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-service-ca\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724271 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-oauth-serving-cert\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724310 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrm7d\" (UniqueName: \"kubernetes.io/projected/c2585874-746d-4a75-b83c-19156ea86946-kube-api-access-hrm7d\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724336 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2585874-746d-4a75-b83c-19156ea86946-console-serving-cert\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724405 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2585874-746d-4a75-b83c-19156ea86946-console-oauth-config\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724425 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-console-config\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.724451 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-trusted-ca-bundle\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.825917 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-service-ca\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826321 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-oauth-serving-cert\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826358 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrm7d\" (UniqueName: \"kubernetes.io/projected/c2585874-746d-4a75-b83c-19156ea86946-kube-api-access-hrm7d\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826386 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2585874-746d-4a75-b83c-19156ea86946-console-serving-cert\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826464 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2585874-746d-4a75-b83c-19156ea86946-console-oauth-config\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826489 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-console-config\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826515 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-trusted-ca-bundle\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.826954 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-service-ca\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.827411 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-trusted-ca-bundle\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.828603 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-console-config\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.828906 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c2585874-746d-4a75-b83c-19156ea86946-oauth-serving-cert\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.834624 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c2585874-746d-4a75-b83c-19156ea86946-console-oauth-config\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.834747 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2585874-746d-4a75-b83c-19156ea86946-console-serving-cert\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.852464 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrm7d\" (UniqueName: \"kubernetes.io/projected/c2585874-746d-4a75-b83c-19156ea86946-kube-api-access-hrm7d\") pod \"console-86779b9dcc-zs7b9\" (UID: \"c2585874-746d-4a75-b83c-19156ea86946\") " pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:44 crc kubenswrapper[4757]: W1216 12:59:44.944618 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a65757_6cab_4b58_a399_d186414d6485.slice/crio-adcb84f0a41fab5fc2b7bbd1a414e49e34cefdd0721410d4fe672553a54e041d WatchSource:0}: Error finding container adcb84f0a41fab5fc2b7bbd1a414e49e34cefdd0721410d4fe672553a54e041d: Status 404 returned error can't find the container with id adcb84f0a41fab5fc2b7bbd1a414e49e34cefdd0721410d4fe672553a54e041d Dec 16 12:59:44 crc kubenswrapper[4757]: I1216 12:59:44.945785 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf"] Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.002508 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.029074 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.032676 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-l6pg2\" (UID: \"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.131259 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/536f7375-828d-41f1-afd3-509271873ae2-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.137112 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/536f7375-828d-41f1-afd3-509271873ae2-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-b9shc\" (UID: \"536f7375-828d-41f1-afd3-509271873ae2\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.233932 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86779b9dcc-zs7b9"] Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.236280 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:45 crc kubenswrapper[4757]: W1216 12:59:45.241561 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2585874_746d_4a75_b83c_19156ea86946.slice/crio-517f8b9f82acf084ebc670ebe95eb778117a9d330ddc9ea7bc1bbddc74123da4 WatchSource:0}: Error finding container 517f8b9f82acf084ebc670ebe95eb778117a9d330ddc9ea7bc1bbddc74123da4: Status 404 returned error can't find the container with id 517f8b9f82acf084ebc670ebe95eb778117a9d330ddc9ea7bc1bbddc74123da4 Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.393208 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.425654 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8dspc" event={"ID":"7a0c9c35-f5d3-4503-831d-840fdc460911","Type":"ContainerStarted","Data":"bb78c1c67dd6bbc1d99df07f8062f697e8870e139f49bed1fa2f84b67bd60150"} Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.433778 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86779b9dcc-zs7b9" event={"ID":"c2585874-746d-4a75-b83c-19156ea86946","Type":"ContainerStarted","Data":"517f8b9f82acf084ebc670ebe95eb778117a9d330ddc9ea7bc1bbddc74123da4"} Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.438572 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" event={"ID":"e6a65757-6cab-4b58-a399-d186414d6485","Type":"ContainerStarted","Data":"adcb84f0a41fab5fc2b7bbd1a414e49e34cefdd0721410d4fe672553a54e041d"} Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.443844 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2"] Dec 16 12:59:45 crc kubenswrapper[4757]: W1216 12:59:45.448670 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97bad1b0_8ffc_4e2e_ac9f_d6f5ed747922.slice/crio-f20f717df75ee3d97b39b579014f1083213ba97541ee5afd37bd6de9c54380f9 WatchSource:0}: Error finding container f20f717df75ee3d97b39b579014f1083213ba97541ee5afd37bd6de9c54380f9: Status 404 returned error can't find the container with id f20f717df75ee3d97b39b579014f1083213ba97541ee5afd37bd6de9c54380f9 Dec 16 12:59:45 crc kubenswrapper[4757]: I1216 12:59:45.601824 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc"] Dec 16 12:59:45 crc kubenswrapper[4757]: W1216 12:59:45.609330 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod536f7375_828d_41f1_afd3_509271873ae2.slice/crio-81abeda6bcc1e739dbb43aa8cbb45ffc7fa4b6d2d412b29f9cdc87aa9c8a34e2 WatchSource:0}: Error finding container 81abeda6bcc1e739dbb43aa8cbb45ffc7fa4b6d2d412b29f9cdc87aa9c8a34e2: Status 404 returned error can't find the container with id 81abeda6bcc1e739dbb43aa8cbb45ffc7fa4b6d2d412b29f9cdc87aa9c8a34e2 Dec 16 12:59:46 crc kubenswrapper[4757]: I1216 12:59:46.446800 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" event={"ID":"536f7375-828d-41f1-afd3-509271873ae2","Type":"ContainerStarted","Data":"81abeda6bcc1e739dbb43aa8cbb45ffc7fa4b6d2d412b29f9cdc87aa9c8a34e2"} Dec 16 12:59:46 crc kubenswrapper[4757]: I1216 12:59:46.448851 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86779b9dcc-zs7b9" event={"ID":"c2585874-746d-4a75-b83c-19156ea86946","Type":"ContainerStarted","Data":"e6aac9a1b41adb8d5926a4b2abb14b8aba52b3636acf38509d443015152a3999"} Dec 16 12:59:46 crc kubenswrapper[4757]: I1216 12:59:46.451972 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" event={"ID":"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922","Type":"ContainerStarted","Data":"f20f717df75ee3d97b39b579014f1083213ba97541ee5afd37bd6de9c54380f9"} Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.471453 4757 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" event={"ID":"e6a65757-6cab-4b58-a399-d186414d6485","Type":"ContainerStarted","Data":"24d833408d1ab59bda46c9b264d716edab97764e64042164d64de83d80abe088"} Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.474428 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" event={"ID":"536f7375-828d-41f1-afd3-509271873ae2","Type":"ContainerStarted","Data":"b239bb4356eb9cc36ff54e291ed33919307ae35545601db724f3667c939d7f0f"} Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.475795 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8dspc" event={"ID":"7a0c9c35-f5d3-4503-831d-840fdc460911","Type":"ContainerStarted","Data":"58bf4dbfb34d9c0f0c7aadf01e4b8eec5fef86d27e3e5851e553ed7f13cd6f79"} Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.476304 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.487734 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-b9shc" podStartSLOduration=2.497347621 podStartE2EDuration="5.487720272s" podCreationTimestamp="2025-12-16 12:59:44 +0000 UTC" firstStartedPulling="2025-12-16 12:59:45.612772153 +0000 UTC m=+771.040515949" lastFinishedPulling="2025-12-16 12:59:48.603144804 +0000 UTC m=+774.030888600" observedRunningTime="2025-12-16 12:59:49.486975074 +0000 UTC m=+774.914718870" watchObservedRunningTime="2025-12-16 12:59:49.487720272 +0000 UTC m=+774.915464058" Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.489455 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86779b9dcc-zs7b9" podStartSLOduration=5.489443447 podStartE2EDuration="5.489443447s" podCreationTimestamp="2025-12-16 12:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:59:46.471753203 +0000 UTC m=+771.899497019" watchObservedRunningTime="2025-12-16 12:59:49.489443447 +0000 UTC m=+774.917187243" Dec 16 12:59:49 crc kubenswrapper[4757]: I1216 12:59:49.511487 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8dspc" podStartSLOduration=1.6292358839999999 podStartE2EDuration="5.511467982s" podCreationTimestamp="2025-12-16 12:59:44 +0000 UTC" firstStartedPulling="2025-12-16 12:59:44.720739152 +0000 UTC m=+770.148482948" lastFinishedPulling="2025-12-16 12:59:48.60297125 +0000 UTC m=+774.030715046" observedRunningTime="2025-12-16 12:59:49.510017855 +0000 UTC m=+774.937761651" watchObservedRunningTime="2025-12-16 12:59:49.511467982 +0000 UTC m=+774.939211778" Dec 16 12:59:52 crc kubenswrapper[4757]: I1216 12:59:52.495567 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" event={"ID":"e6a65757-6cab-4b58-a399-d186414d6485","Type":"ContainerStarted","Data":"c9d824a6415a9f884ffe188b8531a02e7b8adb5e9ac3926780397d812d6abca2"} Dec 16 12:59:52 crc kubenswrapper[4757]: I1216 12:59:52.498969 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" 
event={"ID":"97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922","Type":"ContainerStarted","Data":"ab236603aa8d81558302f4a070e99782ca14aacd4b24e811719ea87ae673ef99"} Dec 16 12:59:52 crc kubenswrapper[4757]: I1216 12:59:52.499385 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 12:59:52 crc kubenswrapper[4757]: I1216 12:59:52.548596 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-gj6pf" podStartSLOduration=1.922002894 podStartE2EDuration="8.548577495s" podCreationTimestamp="2025-12-16 12:59:44 +0000 UTC" firstStartedPulling="2025-12-16 12:59:44.947552647 +0000 UTC m=+770.375296443" lastFinishedPulling="2025-12-16 12:59:51.574127248 +0000 UTC m=+777.001871044" observedRunningTime="2025-12-16 12:59:52.522928336 +0000 UTC m=+777.950672142" watchObservedRunningTime="2025-12-16 12:59:52.548577495 +0000 UTC m=+777.976321291" Dec 16 12:59:54 crc kubenswrapper[4757]: I1216 12:59:54.690790 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8dspc" Dec 16 12:59:54 crc kubenswrapper[4757]: I1216 12:59:54.709317 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" podStartSLOduration=4.584754052 podStartE2EDuration="10.709297099s" podCreationTimestamp="2025-12-16 12:59:44 +0000 UTC" firstStartedPulling="2025-12-16 12:59:45.451946732 +0000 UTC m=+770.879690538" lastFinishedPulling="2025-12-16 12:59:51.576489789 +0000 UTC m=+777.004233585" observedRunningTime="2025-12-16 12:59:52.548228596 +0000 UTC m=+777.975972392" watchObservedRunningTime="2025-12-16 12:59:54.709297099 +0000 UTC m=+780.137040895" Dec 16 12:59:55 crc kubenswrapper[4757]: I1216 12:59:55.003403 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:55 crc kubenswrapper[4757]: I1216 12:59:55.003792 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:55 crc kubenswrapper[4757]: I1216 12:59:55.010941 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:55 crc kubenswrapper[4757]: I1216 12:59:55.521334 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86779b9dcc-zs7b9" Dec 16 12:59:55 crc kubenswrapper[4757]: I1216 12:59:55.583720 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zlc9d"] Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.146107 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj"] Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.147539 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.150809 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.150849 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.157923 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj"] Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.328804 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-secret-volume\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.328877 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-config-volume\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.328907 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrf6\" (UniqueName: \"kubernetes.io/projected/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-kube-api-access-ndrf6\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.430205 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-secret-volume\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.430264 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-config-volume\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.430297 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrf6\" (UniqueName: \"kubernetes.io/projected/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-kube-api-access-ndrf6\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.431864 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-config-volume\") pod 
\"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.441580 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-secret-volume\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.457852 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrf6\" (UniqueName: \"kubernetes.io/projected/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-kube-api-access-ndrf6\") pod \"collect-profiles-29431500-m5mqj\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.518692 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:00 crc kubenswrapper[4757]: I1216 13:00:00.701243 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj"] Dec 16 13:00:00 crc kubenswrapper[4757]: W1216 13:00:00.709216 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c39a6c_e3d7_411a_bc03_f65ff0fe6167.slice/crio-caddf5fc695db3201c0d2b0472a19a36c9d9b55075d6bb4750f6c10531587c02 WatchSource:0}: Error finding container caddf5fc695db3201c0d2b0472a19a36c9d9b55075d6bb4750f6c10531587c02: Status 404 returned error can't find the container with id caddf5fc695db3201c0d2b0472a19a36c9d9b55075d6bb4750f6c10531587c02 Dec 16 13:00:01 crc kubenswrapper[4757]: I1216 13:00:01.550683 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" event={"ID":"04c39a6c-e3d7-411a-bc03-f65ff0fe6167","Type":"ContainerStarted","Data":"c768050d1fed98fc8472ee568d189da5c7c0a1d5cd0b8013b47b7399df75f307"} Dec 16 13:00:01 crc kubenswrapper[4757]: I1216 13:00:01.551054 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" event={"ID":"04c39a6c-e3d7-411a-bc03-f65ff0fe6167","Type":"ContainerStarted","Data":"caddf5fc695db3201c0d2b0472a19a36c9d9b55075d6bb4750f6c10531587c02"} Dec 16 13:00:02 crc kubenswrapper[4757]: I1216 13:00:02.562447 4757 generic.go:334] "Generic (PLEG): container finished" podID="04c39a6c-e3d7-411a-bc03-f65ff0fe6167" containerID="c768050d1fed98fc8472ee568d189da5c7c0a1d5cd0b8013b47b7399df75f307" exitCode=0 Dec 16 13:00:02 crc kubenswrapper[4757]: I1216 13:00:02.562519 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" event={"ID":"04c39a6c-e3d7-411a-bc03-f65ff0fe6167","Type":"ContainerDied","Data":"c768050d1fed98fc8472ee568d189da5c7c0a1d5cd0b8013b47b7399df75f307"} Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.806342 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.973391 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-config-volume\") pod \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.973500 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-secret-volume\") pod \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.974319 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndrf6\" (UniqueName: \"kubernetes.io/projected/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-kube-api-access-ndrf6\") pod \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\" (UID: \"04c39a6c-e3d7-411a-bc03-f65ff0fe6167\") " Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.974378 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-config-volume" (OuterVolumeSpecName: "config-volume") pod "04c39a6c-e3d7-411a-bc03-f65ff0fe6167" (UID: "04c39a6c-e3d7-411a-bc03-f65ff0fe6167"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.974614 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.980874 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-kube-api-access-ndrf6" (OuterVolumeSpecName: "kube-api-access-ndrf6") pod "04c39a6c-e3d7-411a-bc03-f65ff0fe6167" (UID: "04c39a6c-e3d7-411a-bc03-f65ff0fe6167"). InnerVolumeSpecName "kube-api-access-ndrf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:00:03 crc kubenswrapper[4757]: I1216 13:00:03.981214 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "04c39a6c-e3d7-411a-bc03-f65ff0fe6167" (UID: "04c39a6c-e3d7-411a-bc03-f65ff0fe6167"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:00:04 crc kubenswrapper[4757]: I1216 13:00:04.075181 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndrf6\" (UniqueName: \"kubernetes.io/projected/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-kube-api-access-ndrf6\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:04 crc kubenswrapper[4757]: I1216 13:00:04.075245 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04c39a6c-e3d7-411a-bc03-f65ff0fe6167-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:04 crc kubenswrapper[4757]: I1216 13:00:04.574148 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" event={"ID":"04c39a6c-e3d7-411a-bc03-f65ff0fe6167","Type":"ContainerDied","Data":"caddf5fc695db3201c0d2b0472a19a36c9d9b55075d6bb4750f6c10531587c02"} Dec 16 13:00:04 crc kubenswrapper[4757]: I1216 13:00:04.574193 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caddf5fc695db3201c0d2b0472a19a36c9d9b55075d6bb4750f6c10531587c02" Dec 16 13:00:04 crc kubenswrapper[4757]: I1216 13:00:04.574219 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj" Dec 16 13:00:05 crc kubenswrapper[4757]: I1216 13:00:05.243601 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-l6pg2" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.602515 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx"] Dec 16 13:00:18 crc kubenswrapper[4757]: E1216 13:00:18.603367 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04c39a6c-e3d7-411a-bc03-f65ff0fe6167" containerName="collect-profiles" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.603382 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c39a6c-e3d7-411a-bc03-f65ff0fe6167" containerName="collect-profiles" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.603519 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="04c39a6c-e3d7-411a-bc03-f65ff0fe6167" containerName="collect-profiles" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.604423 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.607135 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.611701 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx"] Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.694383 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwsdf\" (UniqueName: \"kubernetes.io/projected/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-kube-api-access-rwsdf\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.694480 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.694582 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.795879 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwsdf\" (UniqueName: \"kubernetes.io/projected/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-kube-api-access-rwsdf\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.795974 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.796086 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.797090 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.797103 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.824288 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwsdf\" (UniqueName: \"kubernetes.io/projected/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-kube-api-access-rwsdf\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:18 crc kubenswrapper[4757]: I1216 13:00:18.923933 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:19 crc kubenswrapper[4757]: I1216 13:00:19.356054 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx"] Dec 16 13:00:19 crc kubenswrapper[4757]: I1216 13:00:19.662287 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" event={"ID":"6bdd3557-11e0-4bc6-b282-80d9d44ecac4","Type":"ContainerStarted","Data":"eed0e21b6fcef66cc91a303bd0a268747557b90970ebb1b21c0222f31faab663"} Dec 16 13:00:20 crc kubenswrapper[4757]: I1216 13:00:20.624398 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zlc9d" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" containerID="cri-o://89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059" gracePeriod=15 Dec 16 13:00:20 crc kubenswrapper[4757]: I1216 13:00:20.669115 4757 generic.go:334] "Generic (PLEG): container finished" podID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerID="bb81cdd411905a904c50ded37bf29750333957e098877abf28069bd7177075cb" exitCode=0 Dec 16 13:00:20 crc kubenswrapper[4757]: I1216 13:00:20.669167 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" event={"ID":"6bdd3557-11e0-4bc6-b282-80d9d44ecac4","Type":"ContainerDied","Data":"bb81cdd411905a904c50ded37bf29750333957e098877abf28069bd7177075cb"} Dec 16 13:00:20 crc kubenswrapper[4757]: I1216 13:00:20.947786 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zlc9d_99d216d0-ff66-4cbe-a5ac-c2ca6a41f920/console/0.log" Dec 16 13:00:20 crc kubenswrapper[4757]: I1216 13:00:20.948162 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.123262 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-serving-cert\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.123909 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-config\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.124946 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-config" (OuterVolumeSpecName: "console-config") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.125300 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-trusted-ca-bundle\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.125366 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-service-ca\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.125410 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-oauth-config\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.125460 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-oauth-serving-cert\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.125513 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwnb4\" (UniqueName: \"kubernetes.io/projected/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-kube-api-access-vwnb4\") pod \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\" (UID: \"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920\") " Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.127574 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-service-ca" (OuterVolumeSpecName: "service-ca") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.128596 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.128913 4757 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.128954 4757 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.128967 4757 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.132834 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-kube-api-access-vwnb4" (OuterVolumeSpecName: "kube-api-access-vwnb4") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "kube-api-access-vwnb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.138099 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.142374 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.142512 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" (UID: "99d216d0-ff66-4cbe-a5ac-c2ca6a41f920"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.230265 4757 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.230298 4757 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.230307 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwnb4\" (UniqueName: \"kubernetes.io/projected/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-kube-api-access-vwnb4\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.230317 4757 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.676907 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zlc9d_99d216d0-ff66-4cbe-a5ac-c2ca6a41f920/console/0.log" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.676966 4757 generic.go:334] "Generic (PLEG): container finished" podID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerID="89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059" exitCode=2 Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.676999 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zlc9d" event={"ID":"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920","Type":"ContainerDied","Data":"89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059"} Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.677054 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zlc9d" event={"ID":"99d216d0-ff66-4cbe-a5ac-c2ca6a41f920","Type":"ContainerDied","Data":"6b0bf25738630c3668a939dd083a58e52c93375512cee087d0272e81010485a0"} Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.677073 4757 scope.go:117] "RemoveContainer" containerID="89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.677200 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zlc9d" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.694461 4757 scope.go:117] "RemoveContainer" containerID="89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059" Dec 16 13:00:21 crc kubenswrapper[4757]: E1216 13:00:21.694881 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059\": container with ID starting with 89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059 not found: ID does not exist" containerID="89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.694940 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059"} err="failed to get container status \"89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059\": rpc error: code = NotFound desc = could not find container \"89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059\": container with ID starting with 89d541bf51e9c4e65e3b0b1196193903b64d10b35405c4db6549ad2f90508059 not found: ID does not exist" Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.710264 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zlc9d"] Dec 16 13:00:21 crc kubenswrapper[4757]: I1216 13:00:21.713586 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zlc9d"] Dec 16 13:00:22 crc kubenswrapper[4757]: I1216 13:00:22.685764 4757 generic.go:334] "Generic (PLEG): container finished" podID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerID="e286d02c27fd91ffd84289b8af07a309b9f0cf99057441dba8c4e5cdf5bab199" exitCode=0 Dec 16 13:00:22 crc kubenswrapper[4757]: I1216 13:00:22.685805 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" event={"ID":"6bdd3557-11e0-4bc6-b282-80d9d44ecac4","Type":"ContainerDied","Data":"e286d02c27fd91ffd84289b8af07a309b9f0cf99057441dba8c4e5cdf5bab199"} Dec 16 13:00:22 crc kubenswrapper[4757]: I1216 13:00:22.957238 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" path="/var/lib/kubelet/pods/99d216d0-ff66-4cbe-a5ac-c2ca6a41f920/volumes" Dec 16 13:00:23 crc kubenswrapper[4757]: I1216 13:00:23.692746 4757 generic.go:334] "Generic (PLEG): container finished" podID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerID="c352a5513f551591eaff80f9d7035da2cccd970ab590e6ab1ae0074e66acef32" exitCode=0 Dec 16 13:00:23 crc kubenswrapper[4757]: I1216 13:00:23.692786 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" event={"ID":"6bdd3557-11e0-4bc6-b282-80d9d44ecac4","Type":"ContainerDied","Data":"c352a5513f551591eaff80f9d7035da2cccd970ab590e6ab1ae0074e66acef32"} Dec 16 13:00:24 crc kubenswrapper[4757]: I1216 13:00:24.906325 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.075340 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-bundle\") pod \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.075404 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwsdf\" (UniqueName: \"kubernetes.io/projected/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-kube-api-access-rwsdf\") pod \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.075478 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-util\") pod \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\" (UID: \"6bdd3557-11e0-4bc6-b282-80d9d44ecac4\") " Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.076341 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-bundle" (OuterVolumeSpecName: "bundle") pod "6bdd3557-11e0-4bc6-b282-80d9d44ecac4" (UID: "6bdd3557-11e0-4bc6-b282-80d9d44ecac4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.080581 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-kube-api-access-rwsdf" (OuterVolumeSpecName: "kube-api-access-rwsdf") pod "6bdd3557-11e0-4bc6-b282-80d9d44ecac4" (UID: "6bdd3557-11e0-4bc6-b282-80d9d44ecac4"). InnerVolumeSpecName "kube-api-access-rwsdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.091586 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-util" (OuterVolumeSpecName: "util") pod "6bdd3557-11e0-4bc6-b282-80d9d44ecac4" (UID: "6bdd3557-11e0-4bc6-b282-80d9d44ecac4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.176547 4757 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.176596 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwsdf\" (UniqueName: \"kubernetes.io/projected/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-kube-api-access-rwsdf\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.176612 4757 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bdd3557-11e0-4bc6-b282-80d9d44ecac4-util\") on node \"crc\" DevicePath \"\"" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.706369 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" event={"ID":"6bdd3557-11e0-4bc6-b282-80d9d44ecac4","Type":"ContainerDied","Data":"eed0e21b6fcef66cc91a303bd0a268747557b90970ebb1b21c0222f31faab663"} Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.706400 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx" Dec 16 13:00:25 crc kubenswrapper[4757]: I1216 13:00:25.706413 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eed0e21b6fcef66cc91a303bd0a268747557b90970ebb1b21c0222f31faab663" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.805037 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp"] Dec 16 13:00:33 crc kubenswrapper[4757]: E1216 13:00:33.805817 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="pull" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.805832 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="pull" Dec 16 13:00:33 crc kubenswrapper[4757]: E1216 13:00:33.805847 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.805855 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" Dec 16 13:00:33 crc kubenswrapper[4757]: E1216 13:00:33.805865 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="extract" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.805872 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="extract" Dec 16 13:00:33 crc kubenswrapper[4757]: E1216 13:00:33.805881 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="util" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.805888 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="util" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.806029 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdd3557-11e0-4bc6-b282-80d9d44ecac4" containerName="extract" Dec 
Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.806042 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d216d0-ff66-4cbe-a5ac-c2ca6a41f920" containerName="console" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.806511 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.814758 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.817119 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.817581 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.818047 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qgvxj" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.819344 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/019f84e1-6fee-4829-a087-c756c955060a-apiservice-cert\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.819389 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/019f84e1-6fee-4829-a087-c756c955060a-webhook-cert\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.819436 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7xn\" (UniqueName: \"kubernetes.io/projected/019f84e1-6fee-4829-a087-c756c955060a-kube-api-access-hl7xn\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.821994 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.840111 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp"] Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.920404 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/019f84e1-6fee-4829-a087-c756c955060a-apiservice-cert\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.920458 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName:
\"kubernetes.io/secret/019f84e1-6fee-4829-a087-c756c955060a-webhook-cert\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.920507 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7xn\" (UniqueName: \"kubernetes.io/projected/019f84e1-6fee-4829-a087-c756c955060a-kube-api-access-hl7xn\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.925727 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/019f84e1-6fee-4829-a087-c756c955060a-webhook-cert\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.938360 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/019f84e1-6fee-4829-a087-c756c955060a-apiservice-cert\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:33 crc kubenswrapper[4757]: I1216 13:00:33.942327 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7xn\" (UniqueName: \"kubernetes.io/projected/019f84e1-6fee-4829-a087-c756c955060a-kube-api-access-hl7xn\") pod \"metallb-operator-controller-manager-84c894d85c-x59lp\" (UID: \"019f84e1-6fee-4829-a087-c756c955060a\") " pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.056226 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8"] Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.056978 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.059529 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.060091 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-l8cs8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.060401 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.076303 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8"] Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.122989 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93d50c8f-84bf-4f97-96ab-98cbbd370476-apiservice-cert\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.123125 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93d50c8f-84bf-4f97-96ab-98cbbd370476-webhook-cert\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.123159 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpw6f\" (UniqueName: \"kubernetes.io/projected/93d50c8f-84bf-4f97-96ab-98cbbd370476-kube-api-access-tpw6f\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.124462 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.223799 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpw6f\" (UniqueName: \"kubernetes.io/projected/93d50c8f-84bf-4f97-96ab-98cbbd370476-kube-api-access-tpw6f\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.224483 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93d50c8f-84bf-4f97-96ab-98cbbd370476-apiservice-cert\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.225135 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93d50c8f-84bf-4f97-96ab-98cbbd370476-webhook-cert\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.229031 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93d50c8f-84bf-4f97-96ab-98cbbd370476-apiservice-cert\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.229191 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93d50c8f-84bf-4f97-96ab-98cbbd370476-webhook-cert\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.248672 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpw6f\" (UniqueName: \"kubernetes.io/projected/93d50c8f-84bf-4f97-96ab-98cbbd370476-kube-api-access-tpw6f\") pod \"metallb-operator-webhook-server-7b74cd5c78-9s4s8\" (UID: \"93d50c8f-84bf-4f97-96ab-98cbbd370476\") " pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.372381 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.584334 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp"] Dec 16 13:00:34 crc kubenswrapper[4757]: W1216 13:00:34.595720 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod019f84e1_6fee_4829_a087_c756c955060a.slice/crio-40e7f165306e20952134bcc6800706ca6ab48c845a3ee56430c4aca85c1ec7db WatchSource:0}: Error finding container 40e7f165306e20952134bcc6800706ca6ab48c845a3ee56430c4aca85c1ec7db: Status 404 returned error can't find the container with id 40e7f165306e20952134bcc6800706ca6ab48c845a3ee56430c4aca85c1ec7db Dec 16 13:00:34 crc kubenswrapper[4757]: I1216 13:00:34.760515 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" event={"ID":"019f84e1-6fee-4829-a087-c756c955060a","Type":"ContainerStarted","Data":"40e7f165306e20952134bcc6800706ca6ab48c845a3ee56430c4aca85c1ec7db"} Dec 16 13:00:35 crc kubenswrapper[4757]: I1216 13:00:35.038517 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8"] Dec 16 13:00:35 crc kubenswrapper[4757]: I1216 13:00:35.767209 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" event={"ID":"93d50c8f-84bf-4f97-96ab-98cbbd370476","Type":"ContainerStarted","Data":"a3a55cd10dcc7df2942f4528573c7a65a37cb39f8d8b0c5eb980be622d9e3441"} Dec 16 13:00:40 crc kubenswrapper[4757]: I1216 13:00:40.851133 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7p5t"] Dec 16 13:00:40 crc kubenswrapper[4757]: I1216 13:00:40.854182 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:40 crc kubenswrapper[4757]: I1216 13:00:40.869398 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7p5t"] Dec 16 13:00:40 crc kubenswrapper[4757]: I1216 13:00:40.912896 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-catalog-content\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:40 crc kubenswrapper[4757]: I1216 13:00:40.912995 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qccq\" (UniqueName: \"kubernetes.io/projected/2dd37c9f-f384-425b-9b1c-25032a31ba77-kube-api-access-6qccq\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:40 crc kubenswrapper[4757]: I1216 13:00:40.913051 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-utilities\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.013506 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qccq\" (UniqueName: \"kubernetes.io/projected/2dd37c9f-f384-425b-9b1c-25032a31ba77-kube-api-access-6qccq\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.019250 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-utilities\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.019418 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-catalog-content\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.019760 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-utilities\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.019813 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-catalog-content\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.044213 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6qccq\" (UniqueName: \"kubernetes.io/projected/2dd37c9f-f384-425b-9b1c-25032a31ba77-kube-api-access-6qccq\") pod \"certified-operators-d7p5t\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.182988 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.821310 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" event={"ID":"019f84e1-6fee-4829-a087-c756c955060a","Type":"ContainerStarted","Data":"3e0be9ae6845db7ec969b655807af38b8cc48f14eba930237001d942424668c3"} Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.821622 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:00:41 crc kubenswrapper[4757]: I1216 13:00:41.854867 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podStartSLOduration=1.963515662 podStartE2EDuration="8.85484747s" podCreationTimestamp="2025-12-16 13:00:33 +0000 UTC" firstStartedPulling="2025-12-16 13:00:34.598637561 +0000 UTC m=+820.026381357" lastFinishedPulling="2025-12-16 13:00:41.489969369 +0000 UTC m=+826.917713165" observedRunningTime="2025-12-16 13:00:41.852151262 +0000 UTC m=+827.279895058" watchObservedRunningTime="2025-12-16 13:00:41.85484747 +0000 UTC m=+827.282591266" Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.051655 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7p5t"] Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.829319 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" event={"ID":"93d50c8f-84bf-4f97-96ab-98cbbd370476","Type":"ContainerStarted","Data":"a8df22159337ad9b8327bcb743d17c0e26e149e233d4bfae4902b6b47a8d6a37"} Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.829455 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.831663 4757 generic.go:334] "Generic (PLEG): container finished" podID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerID="c9e0e62844d2f9b9489e3b54a4b4a40fc88fcb8bfbe5bd1d212a0db7a73af36d" exitCode=0 Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.831788 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerDied","Data":"c9e0e62844d2f9b9489e3b54a4b4a40fc88fcb8bfbe5bd1d212a0db7a73af36d"} Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.831858 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerStarted","Data":"25e6325fbe8d33b8a49b3b4c009075b916c36d1627cabfd126bb0d863485d7af"} Dec 16 13:00:42 crc kubenswrapper[4757]: I1216 13:00:42.858828 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" 
podStartSLOduration=2.2702856049999998 podStartE2EDuration="8.858807716s" podCreationTimestamp="2025-12-16 13:00:34 +0000 UTC" firstStartedPulling="2025-12-16 13:00:35.058265366 +0000 UTC m=+820.486009162" lastFinishedPulling="2025-12-16 13:00:41.646787477 +0000 UTC m=+827.074531273" observedRunningTime="2025-12-16 13:00:42.857632957 +0000 UTC m=+828.285376773" watchObservedRunningTime="2025-12-16 13:00:42.858807716 +0000 UTC m=+828.286551512" Dec 16 13:00:45 crc kubenswrapper[4757]: I1216 13:00:45.853548 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerStarted","Data":"21c2fcfa2165fceb86e329554e902d635b3cdbbf1c281e0a97ad799fbe981377"} Dec 16 13:00:46 crc kubenswrapper[4757]: I1216 13:00:46.871959 4757 generic.go:334] "Generic (PLEG): container finished" podID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerID="21c2fcfa2165fceb86e329554e902d635b3cdbbf1c281e0a97ad799fbe981377" exitCode=0 Dec 16 13:00:46 crc kubenswrapper[4757]: I1216 13:00:46.872032 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerDied","Data":"21c2fcfa2165fceb86e329554e902d635b3cdbbf1c281e0a97ad799fbe981377"} Dec 16 13:00:47 crc kubenswrapper[4757]: I1216 13:00:47.879740 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerStarted","Data":"c030f9eba9daddc98dbeb15da3643959cc32b131f0d84dac77da85cf3acdfd3f"} Dec 16 13:00:47 crc kubenswrapper[4757]: I1216 13:00:47.907312 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7p5t" podStartSLOduration=3.458862988 podStartE2EDuration="7.907293784s" podCreationTimestamp="2025-12-16 13:00:40 +0000 UTC" firstStartedPulling="2025-12-16 13:00:42.833876697 +0000 UTC m=+828.261620503" lastFinishedPulling="2025-12-16 13:00:47.282307503 +0000 UTC m=+832.710051299" observedRunningTime="2025-12-16 13:00:47.905197621 +0000 UTC m=+833.332941417" watchObservedRunningTime="2025-12-16 13:00:47.907293784 +0000 UTC m=+833.335037580" Dec 16 13:00:51 crc kubenswrapper[4757]: I1216 13:00:51.184349 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:51 crc kubenswrapper[4757]: I1216 13:00:51.184744 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:51 crc kubenswrapper[4757]: I1216 13:00:51.224209 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:00:54 crc kubenswrapper[4757]: I1216 13:00:54.377453 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b74cd5c78-9s4s8" Dec 16 13:01:01 crc kubenswrapper[4757]: I1216 13:01:01.223217 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:01:01 crc kubenswrapper[4757]: I1216 13:01:01.278155 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7p5t"] Dec 16 13:01:01 crc kubenswrapper[4757]: I1216 13:01:01.956918 4757 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-d7p5t" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="registry-server" containerID="cri-o://c030f9eba9daddc98dbeb15da3643959cc32b131f0d84dac77da85cf3acdfd3f" gracePeriod=2 Dec 16 13:01:03 crc kubenswrapper[4757]: I1216 13:01:03.967781 4757 generic.go:334] "Generic (PLEG): container finished" podID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerID="c030f9eba9daddc98dbeb15da3643959cc32b131f0d84dac77da85cf3acdfd3f" exitCode=0 Dec 16 13:01:03 crc kubenswrapper[4757]: I1216 13:01:03.967866 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerDied","Data":"c030f9eba9daddc98dbeb15da3643959cc32b131f0d84dac77da85cf3acdfd3f"} Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.172756 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.323985 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qccq\" (UniqueName: \"kubernetes.io/projected/2dd37c9f-f384-425b-9b1c-25032a31ba77-kube-api-access-6qccq\") pod \"2dd37c9f-f384-425b-9b1c-25032a31ba77\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.324219 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-utilities\") pod \"2dd37c9f-f384-425b-9b1c-25032a31ba77\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.324319 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-catalog-content\") pod \"2dd37c9f-f384-425b-9b1c-25032a31ba77\" (UID: \"2dd37c9f-f384-425b-9b1c-25032a31ba77\") " Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.325227 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-utilities" (OuterVolumeSpecName: "utilities") pod "2dd37c9f-f384-425b-9b1c-25032a31ba77" (UID: "2dd37c9f-f384-425b-9b1c-25032a31ba77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.343215 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd37c9f-f384-425b-9b1c-25032a31ba77-kube-api-access-6qccq" (OuterVolumeSpecName: "kube-api-access-6qccq") pod "2dd37c9f-f384-425b-9b1c-25032a31ba77" (UID: "2dd37c9f-f384-425b-9b1c-25032a31ba77"). InnerVolumeSpecName "kube-api-access-6qccq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.387501 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dd37c9f-f384-425b-9b1c-25032a31ba77" (UID: "2dd37c9f-f384-425b-9b1c-25032a31ba77"). InnerVolumeSpecName "catalog-content". 
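
The "Killing container with a grace period" entry above is the normal teardown path for certified-operators-d7p5t: once the SyncLoop DELETE arrives, the kubelet asks CRI-O to stop registry-server with gracePeriod=2, which by convention means SIGTERM first and SIGKILL only if the process outlives the window. A minimal sketch of that two-phase stop against a local process, not the real kubelet/CRI code path:

import signal
import subprocess

def stop_with_grace(proc: subprocess.Popen, grace_seconds: float) -> int:
    # SIGTERM, wait up to the grace period, then SIGKILL -- the escalation
    # implied by gracePeriod=2 in the entry above.
    proc.send_signal(signal.SIGTERM)
    try:
        return proc.wait(timeout=grace_seconds)
    except subprocess.TimeoutExpired:
        proc.kill()
        return proc.wait()

if __name__ == "__main__":
    p = subprocess.Popen(["sleep", "60"])
    # sleep exits on SIGTERM, so this reports -15 (terminated by signal 15)
    print("exit status:", stop_with_grace(p, 2.0))

A container that shuts down cleanly inside the window is what produces the exitCode=0 recorded for c030f9eb... two seconds after the kill request.
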
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.425681 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.425721 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qccq\" (UniqueName: \"kubernetes.io/projected/2dd37c9f-f384-425b-9b1c-25032a31ba77-kube-api-access-6qccq\") on node \"crc\" DevicePath \"\"" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.425734 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd37c9f-f384-425b-9b1c-25032a31ba77-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.978511 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7p5t" event={"ID":"2dd37c9f-f384-425b-9b1c-25032a31ba77","Type":"ContainerDied","Data":"25e6325fbe8d33b8a49b3b4c009075b916c36d1627cabfd126bb0d863485d7af"} Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.978575 4757 scope.go:117] "RemoveContainer" containerID="c030f9eba9daddc98dbeb15da3643959cc32b131f0d84dac77da85cf3acdfd3f" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.978729 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7p5t" Dec 16 13:01:04 crc kubenswrapper[4757]: I1216 13:01:04.993760 4757 scope.go:117] "RemoveContainer" containerID="21c2fcfa2165fceb86e329554e902d635b3cdbbf1c281e0a97ad799fbe981377" Dec 16 13:01:05 crc kubenswrapper[4757]: I1216 13:01:05.008605 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7p5t"] Dec 16 13:01:05 crc kubenswrapper[4757]: I1216 13:01:05.015052 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d7p5t"] Dec 16 13:01:05 crc kubenswrapper[4757]: I1216 13:01:05.020255 4757 scope.go:117] "RemoveContainer" containerID="c9e0e62844d2f9b9489e3b54a4b4a40fc88fcb8bfbe5bd1d212a0db7a73af36d" Dec 16 13:01:06 crc kubenswrapper[4757]: I1216 13:01:06.959077 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" path="/var/lib/kubelet/pods/2dd37c9f-f384-425b-9b1c-25032a31ba77/volumes" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.127535 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.971460 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds"] Dec 16 13:01:14 crc kubenswrapper[4757]: E1216 13:01:14.972353 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="registry-server" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.972444 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="registry-server" Dec 16 13:01:14 crc kubenswrapper[4757]: E1216 13:01:14.972531 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="extract-utilities" Dec 16 13:01:14 crc kubenswrapper[4757]: 
I1216 13:01:14.972599 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="extract-utilities" Dec 16 13:01:14 crc kubenswrapper[4757]: E1216 13:01:14.972677 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="extract-content" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.972741 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="extract-content" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.972943 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd37c9f-f384-425b-9b1c-25032a31ba77" containerName="registry-server" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.973461 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dfprv"] Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.974792 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.976290 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds"] Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.976863 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.983472 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.983711 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rztwh" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.986365 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 16 13:01:14 crc kubenswrapper[4757]: I1216 13:01:14.993624 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053177 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnd4\" (UniqueName: \"kubernetes.io/projected/97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b-kube-api-access-6lnd4\") pod \"frr-k8s-webhook-server-7784b6fcf-cwgds\" (UID: \"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053231 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-reloader\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053250 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ea2db10-97ca-4173-9766-c34220e3958b-metrics-certs\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053288 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-frr-sockets\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053336 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2m8n\" (UniqueName: \"kubernetes.io/projected/0ea2db10-97ca-4173-9766-c34220e3958b-kube-api-access-n2m8n\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053352 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-cwgds\" (UID: \"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053368 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-metrics\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053395 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ea2db10-97ca-4173-9766-c34220e3958b-frr-startup\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.053443 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-frr-conf\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.073802 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-996kc"] Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.075844 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.077969 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.079367 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.079801 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.079979 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tbs4j" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.096583 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-jwvt9"] Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.097662 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.099654 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.154844 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-cert\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.155864 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-frr-conf\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.155975 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8074db35-8766-4cc2-bc06-be8a150f92e9-metallb-excludel2\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156090 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbzs\" (UniqueName: \"kubernetes.io/projected/dd784875-6828-4554-8791-24182d80b82f-kube-api-access-pwbzs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156170 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmzcw\" (UniqueName: \"kubernetes.io/projected/8074db35-8766-4cc2-bc06-be8a150f92e9-kube-api-access-jmzcw\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156263 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnd4\" (UniqueName: \"kubernetes.io/projected/97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b-kube-api-access-6lnd4\") pod \"frr-k8s-webhook-server-7784b6fcf-cwgds\" (UID: \"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156349 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156436 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-metrics-certs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156537 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-reloader\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156647 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ea2db10-97ca-4173-9766-c34220e3958b-metrics-certs\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156737 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-frr-sockets\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156845 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2m8n\" (UniqueName: \"kubernetes.io/projected/0ea2db10-97ca-4173-9766-c34220e3958b-kube-api-access-n2m8n\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.156942 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-metrics\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.157148 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-cwgds\" (UID: \"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.157263 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ea2db10-97ca-4173-9766-c34220e3958b-frr-startup\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.157363 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-metrics-certs\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.157633 4757 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.157753 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ea2db10-97ca-4173-9766-c34220e3958b-metrics-certs podName:0ea2db10-97ca-4173-9766-c34220e3958b nodeName:}" failed. No retries permitted until 2025-12-16 13:01:15.657735803 +0000 UTC m=+861.085479599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0ea2db10-97ca-4173-9766-c34220e3958b-metrics-certs") pod "frr-k8s-dfprv" (UID: "0ea2db10-97ca-4173-9766-c34220e3958b") : secret "frr-k8s-certs-secret" not found Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.160065 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-frr-conf\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.161413 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-jwvt9"] Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.161607 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-reloader\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.172076 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-cwgds\" (UID: \"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.176150 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-frr-sockets\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.176150 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ea2db10-97ca-4173-9766-c34220e3958b-metrics\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.177043 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ea2db10-97ca-4173-9766-c34220e3958b-frr-startup\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.179649 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2m8n\" (UniqueName: \"kubernetes.io/projected/0ea2db10-97ca-4173-9766-c34220e3958b-kube-api-access-n2m8n\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.207088 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnd4\" (UniqueName: \"kubernetes.io/projected/97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b-kube-api-access-6lnd4\") pod \"frr-k8s-webhook-server-7784b6fcf-cwgds\" (UID: \"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259088 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-cert\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259172 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8074db35-8766-4cc2-bc06-be8a150f92e9-metallb-excludel2\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259204 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbzs\" (UniqueName: \"kubernetes.io/projected/dd784875-6828-4554-8791-24182d80b82f-kube-api-access-pwbzs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259231 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmzcw\" (UniqueName: \"kubernetes.io/projected/8074db35-8766-4cc2-bc06-be8a150f92e9-kube-api-access-jmzcw\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259267 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259292 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-metrics-certs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.259349 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-metrics-certs\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.259513 4757 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.259583 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-metrics-certs podName:8074db35-8766-4cc2-bc06-be8a150f92e9 nodeName:}" failed. No retries permitted until 2025-12-16 13:01:15.759565399 +0000 UTC m=+861.187309195 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-metrics-certs") pod "speaker-996kc" (UID: "8074db35-8766-4cc2-bc06-be8a150f92e9") : secret "speaker-certs-secret" not found Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.259893 4757 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.259927 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist podName:8074db35-8766-4cc2-bc06-be8a150f92e9 nodeName:}" failed. No retries permitted until 2025-12-16 13:01:15.759917658 +0000 UTC m=+861.187661454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist") pod "speaker-996kc" (UID: "8074db35-8766-4cc2-bc06-be8a150f92e9") : secret "metallb-memberlist" not found Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.260023 4757 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.260110 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-metrics-certs podName:dd784875-6828-4554-8791-24182d80b82f nodeName:}" failed. No retries permitted until 2025-12-16 13:01:15.760086432 +0000 UTC m=+861.187830268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-metrics-certs") pod "controller-5bddd4b946-jwvt9" (UID: "dd784875-6828-4554-8791-24182d80b82f") : secret "controller-certs-secret" not found Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.260626 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8074db35-8766-4cc2-bc06-be8a150f92e9-metallb-excludel2\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.261871 4757 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.278490 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-cert\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.281936 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbzs\" (UniqueName: \"kubernetes.io/projected/dd784875-6828-4554-8791-24182d80b82f-kube-api-access-pwbzs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.282611 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmzcw\" (UniqueName: \"kubernetes.io/projected/8074db35-8766-4cc2-bc06-be8a150f92e9-kube-api-access-jmzcw\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 
13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.291452 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.537705 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds"] Dec 16 13:01:15 crc kubenswrapper[4757]: W1216 13:01:15.543374 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97dcbeac_b6c6_4892_91fd_4cd45b2b1f9b.slice/crio-eae71eb2662169e908ed4b25632455143480c36cba782afc1e50db00b8667a64 WatchSource:0}: Error finding container eae71eb2662169e908ed4b25632455143480c36cba782afc1e50db00b8667a64: Status 404 returned error can't find the container with id eae71eb2662169e908ed4b25632455143480c36cba782afc1e50db00b8667a64 Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.663810 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ea2db10-97ca-4173-9766-c34220e3958b-metrics-certs\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.667669 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ea2db10-97ca-4173-9766-c34220e3958b-metrics-certs\") pod \"frr-k8s-dfprv\" (UID: \"0ea2db10-97ca-4173-9766-c34220e3958b\") " pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.765829 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.765910 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-metrics-certs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.765963 4757 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.765980 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-metrics-certs\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: E1216 13:01:15.766046 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist podName:8074db35-8766-4cc2-bc06-be8a150f92e9 nodeName:}" failed. No retries permitted until 2025-12-16 13:01:16.766026966 +0000 UTC m=+862.193770762 (durationBeforeRetry 1s). 
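
Comparing the retry entries shows the per-volume exponential backoff at work: the memberlist failure at 13:01:15.259 schedules a retry 500ms out, and when the attempt at 13:01:15.765 still finds no secret, the next durationBeforeRetry doubles to 1s. A sketch of that schedule, with the initial delay and doubling factor read straight off the log; the cap is an assumption (the kubelet bounds volume retries at roughly a couple of minutes):

def backoff_delays(initial: float = 0.5, factor: float = 2.0,
                   cap: float = 120.0, attempts: int = 6):
    # Delay sequence behind durationBeforeRetry: 500ms, 1s, 2s, ... up to cap.
    delay = initial
    for _ in range(attempts):
        yield delay
        delay = min(delay * factor, cap)

print(list(backoff_delays()))   # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]

Because the backoff is keyed per volume (nestedpendingoperations tracks each {volumeName, podName} pair separately, as the quoted operation keys show), the speaker-996kc metrics-certs mount can succeed at 13:01:15.768986 while its memberlist mount is still waiting out its own delay.
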
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist") pod "speaker-996kc" (UID: "8074db35-8766-4cc2-bc06-be8a150f92e9") : secret "metallb-memberlist" not found Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.768909 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd784875-6828-4554-8791-24182d80b82f-metrics-certs\") pod \"controller-5bddd4b946-jwvt9\" (UID: \"dd784875-6828-4554-8791-24182d80b82f\") " pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.768986 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-metrics-certs\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:15 crc kubenswrapper[4757]: I1216 13:01:15.898193 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.014517 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.045360 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" event={"ID":"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b","Type":"ContainerStarted","Data":"eae71eb2662169e908ed4b25632455143480c36cba782afc1e50db00b8667a64"} Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.046678 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"1c7ba140fd108c742c3edf04672178b8cf9374822b5730b8242907bcd13b6037"} Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.344528 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-jwvt9"] Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.777460 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.789273 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8074db35-8766-4cc2-bc06-be8a150f92e9-memberlist\") pod \"speaker-996kc\" (UID: \"8074db35-8766-4cc2-bc06-be8a150f92e9\") " pod="metallb-system/speaker-996kc" Dec 16 13:01:16 crc kubenswrapper[4757]: I1216 13:01:16.888627 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-996kc" Dec 16 13:01:17 crc kubenswrapper[4757]: I1216 13:01:17.074242 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-996kc" event={"ID":"8074db35-8766-4cc2-bc06-be8a150f92e9","Type":"ContainerStarted","Data":"8d07b07ece4f68ca7b590609a368da5fb0043a9dc76b439e3cbf6f0168714bd3"} Dec 16 13:01:17 crc kubenswrapper[4757]: I1216 13:01:17.092754 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-jwvt9" event={"ID":"dd784875-6828-4554-8791-24182d80b82f","Type":"ContainerStarted","Data":"09a029f4dc261d05399aac88c0a06925c1a12e04bb28b96a315d158536b9f482"} Dec 16 13:01:17 crc kubenswrapper[4757]: I1216 13:01:17.092805 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-jwvt9" event={"ID":"dd784875-6828-4554-8791-24182d80b82f","Type":"ContainerStarted","Data":"b77072c40339d3c5b1829219ad657c97a44e8babd6142de5186c5533f124ef84"} Dec 16 13:01:17 crc kubenswrapper[4757]: I1216 13:01:17.092816 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-jwvt9" event={"ID":"dd784875-6828-4554-8791-24182d80b82f","Type":"ContainerStarted","Data":"fbb1d9926ee17ac346874ee4ba5f8343f676ed981bb6b5ac28f71b3a9a8a7270"} Dec 16 13:01:17 crc kubenswrapper[4757]: I1216 13:01:17.093738 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:17 crc kubenswrapper[4757]: I1216 13:01:17.129197 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-jwvt9" podStartSLOduration=2.129178888 podStartE2EDuration="2.129178888s" podCreationTimestamp="2025-12-16 13:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:01:17.126320725 +0000 UTC m=+862.554064541" watchObservedRunningTime="2025-12-16 13:01:17.129178888 +0000 UTC m=+862.556922684" Dec 16 13:01:18 crc kubenswrapper[4757]: I1216 13:01:18.117941 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-996kc" event={"ID":"8074db35-8766-4cc2-bc06-be8a150f92e9","Type":"ContainerStarted","Data":"c4dc2c113fb16459b92f678fe2bb149f5ebb0d4aee45f5e3c4a6527a87fd4bfa"} Dec 16 13:01:19 crc kubenswrapper[4757]: I1216 13:01:19.130679 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-996kc" event={"ID":"8074db35-8766-4cc2-bc06-be8a150f92e9","Type":"ContainerStarted","Data":"c780905e1d6baf698c68f3e0d28a0161210ea2666222b1efe995786c1d002131"} Dec 16 13:01:19 crc kubenswrapper[4757]: I1216 13:01:19.130734 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-996kc" Dec 16 13:01:19 crc kubenswrapper[4757]: I1216 13:01:19.153309 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-996kc" podStartSLOduration=4.153291335 podStartE2EDuration="4.153291335s" podCreationTimestamp="2025-12-16 13:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:01:19.148278907 +0000 UTC m=+864.576022703" watchObservedRunningTime="2025-12-16 13:01:19.153291335 +0000 UTC m=+864.581035131" Dec 16 13:01:21 crc kubenswrapper[4757]: I1216 13:01:21.181285 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:01:21 crc kubenswrapper[4757]: I1216 13:01:21.181626 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:01:24 crc kubenswrapper[4757]: I1216 13:01:24.166295 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" event={"ID":"97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b","Type":"ContainerStarted","Data":"f64ae969e44823a5158e0ecd951408e108d44ec4b0d2c8c9532bc44368189324"} Dec 16 13:01:24 crc kubenswrapper[4757]: I1216 13:01:24.166982 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:24 crc kubenswrapper[4757]: I1216 13:01:24.168361 4757 generic.go:334] "Generic (PLEG): container finished" podID="0ea2db10-97ca-4173-9766-c34220e3958b" containerID="853bf267b96ef2d2b49adf94ca8418f4f2d5c9f1e5d55e4de5c54eb889a44ebc" exitCode=0 Dec 16 13:01:24 crc kubenswrapper[4757]: I1216 13:01:24.168405 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerDied","Data":"853bf267b96ef2d2b49adf94ca8418f4f2d5c9f1e5d55e4de5c54eb889a44ebc"} Dec 16 13:01:24 crc kubenswrapper[4757]: I1216 13:01:24.188952 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" podStartSLOduration=2.093318622 podStartE2EDuration="10.188931062s" podCreationTimestamp="2025-12-16 13:01:14 +0000 UTC" firstStartedPulling="2025-12-16 13:01:15.546838857 +0000 UTC m=+860.974582653" lastFinishedPulling="2025-12-16 13:01:23.642451297 +0000 UTC m=+869.070195093" observedRunningTime="2025-12-16 13:01:24.184604981 +0000 UTC m=+869.612348787" watchObservedRunningTime="2025-12-16 13:01:24.188931062 +0000 UTC m=+869.616674858" Dec 16 13:01:25 crc kubenswrapper[4757]: I1216 13:01:25.175511 4757 generic.go:334] "Generic (PLEG): container finished" podID="0ea2db10-97ca-4173-9766-c34220e3958b" containerID="7312c5bcf4a3d3fc9581d5224bd5f1e37e5da1ef8574eeeafb5bfccbd21f1d5f" exitCode=0 Dec 16 13:01:25 crc kubenswrapper[4757]: I1216 13:01:25.175709 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerDied","Data":"7312c5bcf4a3d3fc9581d5224bd5f1e37e5da1ef8574eeeafb5bfccbd21f1d5f"} Dec 16 13:01:26 crc kubenswrapper[4757]: I1216 13:01:26.022205 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-jwvt9" Dec 16 13:01:26 crc kubenswrapper[4757]: I1216 13:01:26.182210 4757 generic.go:334] "Generic (PLEG): container finished" podID="0ea2db10-97ca-4173-9766-c34220e3958b" containerID="8d4cd14a5e35a797251412bd55f05bba7b51cd97b704c1beca7d075562194150" exitCode=0 Dec 16 13:01:26 crc kubenswrapper[4757]: I1216 13:01:26.182272 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" 
event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerDied","Data":"8d4cd14a5e35a797251412bd55f05bba7b51cd97b704c1beca7d075562194150"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.199917 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"bb8ad56e6bce9dd05ae24ff801a02e5f055c27117b16f89c9cc2fd5bad68be62"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.200306 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"1398ab61aa2774d7a16b8c3bd3e23f957897c993439653c79aee4eb721f60e77"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.200323 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"c013cb2f336aa1b39ca19f09f146c5ded234515bcba88aa155b4e37bc805022e"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.200350 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"bdf926a3b8b1c364d6aa33a3f8c65c668a1faf38bf5446aa097c5f9e9f8da651"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.200362 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"8069224c7ec2793584460de6771c416b77d0df2eb9b4bb5104d8ee9310d1e17d"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.200371 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dfprv" event={"ID":"0ea2db10-97ca-4173-9766-c34220e3958b","Type":"ContainerStarted","Data":"2b0541ea7cbecb555d233c3755b84603ca7b56266a20b04ad5f52d0d0b731f9b"} Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.201288 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:27 crc kubenswrapper[4757]: I1216 13:01:27.229213 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dfprv" podStartSLOduration=5.604152004 podStartE2EDuration="13.229190608s" podCreationTimestamp="2025-12-16 13:01:14 +0000 UTC" firstStartedPulling="2025-12-16 13:01:16.03403528 +0000 UTC m=+861.461779076" lastFinishedPulling="2025-12-16 13:01:23.659073884 +0000 UTC m=+869.086817680" observedRunningTime="2025-12-16 13:01:27.223973384 +0000 UTC m=+872.651717200" watchObservedRunningTime="2025-12-16 13:01:27.229190608 +0000 UTC m=+872.656934404" Dec 16 13:01:30 crc kubenswrapper[4757]: I1216 13:01:30.898864 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:30 crc kubenswrapper[4757]: I1216 13:01:30.936092 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:35 crc kubenswrapper[4757]: I1216 13:01:35.297215 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-cwgds" Dec 16 13:01:36 crc kubenswrapper[4757]: I1216 13:01:36.894295 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-996kc" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.789086 4757 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cn9nk"] Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.789810 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.792171 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.793386 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.793867 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fqgw8" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.798371 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrm6n\" (UniqueName: \"kubernetes.io/projected/b56f5192-de72-41a1-b733-edd456541eda-kube-api-access-lrm6n\") pod \"openstack-operator-index-cn9nk\" (UID: \"b56f5192-de72-41a1-b733-edd456541eda\") " pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.814431 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cn9nk"] Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.900093 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrm6n\" (UniqueName: \"kubernetes.io/projected/b56f5192-de72-41a1-b733-edd456541eda-kube-api-access-lrm6n\") pod \"openstack-operator-index-cn9nk\" (UID: \"b56f5192-de72-41a1-b733-edd456541eda\") " pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:39 crc kubenswrapper[4757]: I1216 13:01:39.919134 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrm6n\" (UniqueName: \"kubernetes.io/projected/b56f5192-de72-41a1-b733-edd456541eda-kube-api-access-lrm6n\") pod \"openstack-operator-index-cn9nk\" (UID: \"b56f5192-de72-41a1-b733-edd456541eda\") " pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:40 crc kubenswrapper[4757]: I1216 13:01:40.109304 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:41 crc kubenswrapper[4757]: I1216 13:01:40.540602 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cn9nk"] Dec 16 13:01:41 crc kubenswrapper[4757]: I1216 13:01:41.289852 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cn9nk" event={"ID":"b56f5192-de72-41a1-b733-edd456541eda","Type":"ContainerStarted","Data":"0cd95957854c0d57101fb54ec76d4eb57e9d98f26f4934d34dd1e0bdb062ba1f"} Dec 16 13:01:45 crc kubenswrapper[4757]: I1216 13:01:45.322577 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cn9nk" event={"ID":"b56f5192-de72-41a1-b733-edd456541eda","Type":"ContainerStarted","Data":"ac7b54f14b5c02a3f51000922c2505c92ea5829d362286ce622bfed31f2c44bc"} Dec 16 13:01:45 crc kubenswrapper[4757]: I1216 13:01:45.346322 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cn9nk" podStartSLOduration=2.356155358 podStartE2EDuration="6.346293442s" podCreationTimestamp="2025-12-16 13:01:39 +0000 UTC" firstStartedPulling="2025-12-16 13:01:40.548787521 +0000 UTC m=+885.976531317" lastFinishedPulling="2025-12-16 13:01:44.538925605 +0000 UTC m=+889.966669401" observedRunningTime="2025-12-16 13:01:45.339383214 +0000 UTC m=+890.767127020" watchObservedRunningTime="2025-12-16 13:01:45.346293442 +0000 UTC m=+890.774037238" Dec 16 13:01:45 crc kubenswrapper[4757]: I1216 13:01:45.902848 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dfprv" Dec 16 13:01:50 crc kubenswrapper[4757]: I1216 13:01:50.110385 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:50 crc kubenswrapper[4757]: I1216 13:01:50.110653 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:50 crc kubenswrapper[4757]: I1216 13:01:50.138127 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:50 crc kubenswrapper[4757]: I1216 13:01:50.377511 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cn9nk" Dec 16 13:01:51 crc kubenswrapper[4757]: I1216 13:01:51.181471 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:01:51 crc kubenswrapper[4757]: I1216 13:01:51.182162 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.880528 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp"] Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.882390 4757 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.884121 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-h4pr4" Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.895554 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp"] Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.956152 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-bundle\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.956210 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-util\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:57 crc kubenswrapper[4757]: I1216 13:01:57.956247 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjgb\" (UniqueName: \"kubernetes.io/projected/62758af3-0127-42e3-a06f-0ba9ff4452c4-kube-api-access-zpjgb\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.057030 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-bundle\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.057094 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-util\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.057132 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjgb\" (UniqueName: \"kubernetes.io/projected/62758af3-0127-42e3-a06f-0ba9ff4452c4-kube-api-access-zpjgb\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.057695 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-bundle\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.057695 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-util\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.078083 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjgb\" (UniqueName: \"kubernetes.io/projected/62758af3-0127-42e3-a06f-0ba9ff4452c4-kube-api-access-zpjgb\") pod \"694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.198461 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:01:58 crc kubenswrapper[4757]: I1216 13:01:58.630357 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp"] Dec 16 13:01:58 crc kubenswrapper[4757]: W1216 13:01:58.639243 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62758af3_0127_42e3_a06f_0ba9ff4452c4.slice/crio-bea90ed64db9e604400e23e6ef71b6807cb49647fd14fb0b71cfd06e8ec29b04 WatchSource:0}: Error finding container bea90ed64db9e604400e23e6ef71b6807cb49647fd14fb0b71cfd06e8ec29b04: Status 404 returned error can't find the container with id bea90ed64db9e604400e23e6ef71b6807cb49647fd14fb0b71cfd06e8ec29b04 Dec 16 13:01:59 crc kubenswrapper[4757]: I1216 13:01:59.406917 4757 generic.go:334] "Generic (PLEG): container finished" podID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerID="52e4a7732d13d2c0ce13af8ce1df1711e743638d29497bd79dbfccce4074020a" exitCode=0 Dec 16 13:01:59 crc kubenswrapper[4757]: I1216 13:01:59.407025 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" event={"ID":"62758af3-0127-42e3-a06f-0ba9ff4452c4","Type":"ContainerDied","Data":"52e4a7732d13d2c0ce13af8ce1df1711e743638d29497bd79dbfccce4074020a"} Dec 16 13:01:59 crc kubenswrapper[4757]: I1216 13:01:59.407234 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" event={"ID":"62758af3-0127-42e3-a06f-0ba9ff4452c4","Type":"ContainerStarted","Data":"bea90ed64db9e604400e23e6ef71b6807cb49647fd14fb0b71cfd06e8ec29b04"} Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.061818 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-btg7w"] Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.063343 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.073596 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btg7w"] Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.204941 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h747m\" (UniqueName: \"kubernetes.io/projected/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-kube-api-access-h747m\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.205195 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-catalog-content\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.205245 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-utilities\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.326724 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-catalog-content\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.326784 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-utilities\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.326820 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h747m\" (UniqueName: \"kubernetes.io/projected/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-kube-api-access-h747m\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.327276 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-catalog-content\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.327347 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-utilities\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.350192 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h747m\" (UniqueName: \"kubernetes.io/projected/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-kube-api-access-h747m\") pod \"community-operators-btg7w\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.387525 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:01 crc kubenswrapper[4757]: I1216 13:02:01.908670 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-btg7w"] Dec 16 13:02:01 crc kubenswrapper[4757]: W1216 13:02:01.952678 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8edb19e_a5a5_4af9_9e0c_6e745ac919e8.slice/crio-e7fd5217ce99400757e110820edd2482a85421030c6c4e3a08dcfca9b8b5dee3 WatchSource:0}: Error finding container e7fd5217ce99400757e110820edd2482a85421030c6c4e3a08dcfca9b8b5dee3: Status 404 returned error can't find the container with id e7fd5217ce99400757e110820edd2482a85421030c6c4e3a08dcfca9b8b5dee3 Dec 16 13:02:02 crc kubenswrapper[4757]: I1216 13:02:02.423972 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btg7w" event={"ID":"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8","Type":"ContainerStarted","Data":"e7fd5217ce99400757e110820edd2482a85421030c6c4e3a08dcfca9b8b5dee3"} Dec 16 13:02:03 crc kubenswrapper[4757]: I1216 13:02:03.431777 4757 generic.go:334] "Generic (PLEG): container finished" podID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerID="69522bd8e007b541d9615cfad76a90142d77a751a8f78faec0c5ebe601065d65" exitCode=0 Dec 16 13:02:03 crc kubenswrapper[4757]: I1216 13:02:03.431822 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" event={"ID":"62758af3-0127-42e3-a06f-0ba9ff4452c4","Type":"ContainerDied","Data":"69522bd8e007b541d9615cfad76a90142d77a751a8f78faec0c5ebe601065d65"} Dec 16 13:02:03 crc kubenswrapper[4757]: I1216 13:02:03.435619 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerID="b315292e8173318c3d3f990fada0b084b2534996da01a53e12df29a7b5887d85" exitCode=0 Dec 16 13:02:03 crc kubenswrapper[4757]: I1216 13:02:03.435770 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btg7w" event={"ID":"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8","Type":"ContainerDied","Data":"b315292e8173318c3d3f990fada0b084b2534996da01a53e12df29a7b5887d85"} Dec 16 13:02:04 crc kubenswrapper[4757]: I1216 13:02:04.443519 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" event={"ID":"62758af3-0127-42e3-a06f-0ba9ff4452c4","Type":"ContainerStarted","Data":"2c5f00360aa979206e3cfa05a8741748086c998713b053d64dcbe875077da67c"} Dec 16 13:02:05 crc kubenswrapper[4757]: I1216 13:02:05.453481 4757 generic.go:334] "Generic (PLEG): container finished" podID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerID="2c5f00360aa979206e3cfa05a8741748086c998713b053d64dcbe875077da67c" exitCode=0 Dec 16 13:02:05 crc kubenswrapper[4757]: I1216 13:02:05.454213 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" 
event={"ID":"62758af3-0127-42e3-a06f-0ba9ff4452c4","Type":"ContainerDied","Data":"2c5f00360aa979206e3cfa05a8741748086c998713b053d64dcbe875077da67c"} Dec 16 13:02:05 crc kubenswrapper[4757]: I1216 13:02:05.455476 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerID="8e24f74c89df2d4e8232014d000906bca61c7c009e60bd5840003f72005545df" exitCode=0 Dec 16 13:02:05 crc kubenswrapper[4757]: I1216 13:02:05.455573 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btg7w" event={"ID":"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8","Type":"ContainerDied","Data":"8e24f74c89df2d4e8232014d000906bca61c7c009e60bd5840003f72005545df"} Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.703726 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.903218 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpjgb\" (UniqueName: \"kubernetes.io/projected/62758af3-0127-42e3-a06f-0ba9ff4452c4-kube-api-access-zpjgb\") pod \"62758af3-0127-42e3-a06f-0ba9ff4452c4\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.903368 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-util\") pod \"62758af3-0127-42e3-a06f-0ba9ff4452c4\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.903421 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-bundle\") pod \"62758af3-0127-42e3-a06f-0ba9ff4452c4\" (UID: \"62758af3-0127-42e3-a06f-0ba9ff4452c4\") " Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.904230 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-bundle" (OuterVolumeSpecName: "bundle") pod "62758af3-0127-42e3-a06f-0ba9ff4452c4" (UID: "62758af3-0127-42e3-a06f-0ba9ff4452c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.907895 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62758af3-0127-42e3-a06f-0ba9ff4452c4-kube-api-access-zpjgb" (OuterVolumeSpecName: "kube-api-access-zpjgb") pod "62758af3-0127-42e3-a06f-0ba9ff4452c4" (UID: "62758af3-0127-42e3-a06f-0ba9ff4452c4"). InnerVolumeSpecName "kube-api-access-zpjgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:02:06 crc kubenswrapper[4757]: I1216 13:02:06.916404 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-util" (OuterVolumeSpecName: "util") pod "62758af3-0127-42e3-a06f-0ba9ff4452c4" (UID: "62758af3-0127-42e3-a06f-0ba9ff4452c4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.005894 4757 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-util\") on node \"crc\" DevicePath \"\"" Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.006244 4757 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62758af3-0127-42e3-a06f-0ba9ff4452c4-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.006343 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpjgb\" (UniqueName: \"kubernetes.io/projected/62758af3-0127-42e3-a06f-0ba9ff4452c4-kube-api-access-zpjgb\") on node \"crc\" DevicePath \"\"" Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.494285 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" event={"ID":"62758af3-0127-42e3-a06f-0ba9ff4452c4","Type":"ContainerDied","Data":"bea90ed64db9e604400e23e6ef71b6807cb49647fd14fb0b71cfd06e8ec29b04"} Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.494656 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bea90ed64db9e604400e23e6ef71b6807cb49647fd14fb0b71cfd06e8ec29b04" Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.494304 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp" Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.496534 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btg7w" event={"ID":"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8","Type":"ContainerStarted","Data":"7b51940f94242a597be3710ae28c955c6f0365a83cf3f9f61b8c9c5028a38820"} Dec 16 13:02:07 crc kubenswrapper[4757]: I1216 13:02:07.515553 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-btg7w" podStartSLOduration=3.521180808 podStartE2EDuration="6.515531131s" podCreationTimestamp="2025-12-16 13:02:01 +0000 UTC" firstStartedPulling="2025-12-16 13:02:03.437425842 +0000 UTC m=+908.865169638" lastFinishedPulling="2025-12-16 13:02:06.431776165 +0000 UTC m=+911.859519961" observedRunningTime="2025-12-16 13:02:07.513399928 +0000 UTC m=+912.941143724" watchObservedRunningTime="2025-12-16 13:02:07.515531131 +0000 UTC m=+912.943274917" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.828159 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t"] Dec 16 13:02:08 crc kubenswrapper[4757]: E1216 13:02:08.828752 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerName="pull" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.828768 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerName="pull" Dec 16 13:02:08 crc kubenswrapper[4757]: E1216 13:02:08.828791 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerName="extract" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.828798 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" 
containerName="extract" Dec 16 13:02:08 crc kubenswrapper[4757]: E1216 13:02:08.828819 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerName="util" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.828826 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerName="util" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.828965 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="62758af3-0127-42e3-a06f-0ba9ff4452c4" containerName="extract" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.829458 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.836305 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-7bz95" Dec 16 13:02:08 crc kubenswrapper[4757]: I1216 13:02:08.862644 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t"] Dec 16 13:02:09 crc kubenswrapper[4757]: I1216 13:02:09.031561 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4b8\" (UniqueName: \"kubernetes.io/projected/4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0-kube-api-access-vt4b8\") pod \"openstack-operator-controller-operator-56fbb56c9b-wtj5t\" (UID: \"4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0\") " pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" Dec 16 13:02:09 crc kubenswrapper[4757]: I1216 13:02:09.134122 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4b8\" (UniqueName: \"kubernetes.io/projected/4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0-kube-api-access-vt4b8\") pod \"openstack-operator-controller-operator-56fbb56c9b-wtj5t\" (UID: \"4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0\") " pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" Dec 16 13:02:09 crc kubenswrapper[4757]: I1216 13:02:09.160135 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4b8\" (UniqueName: \"kubernetes.io/projected/4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0-kube-api-access-vt4b8\") pod \"openstack-operator-controller-operator-56fbb56c9b-wtj5t\" (UID: \"4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0\") " pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" Dec 16 13:02:09 crc kubenswrapper[4757]: I1216 13:02:09.446941 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" Dec 16 13:02:09 crc kubenswrapper[4757]: I1216 13:02:09.682968 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t"] Dec 16 13:02:10 crc kubenswrapper[4757]: I1216 13:02:10.514701 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" event={"ID":"4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0","Type":"ContainerStarted","Data":"532c9497aa4501b29de443bf4e8a610e1978f218d921ea4efab785d17f49988a"} Dec 16 13:02:11 crc kubenswrapper[4757]: I1216 13:02:11.388255 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:11 crc kubenswrapper[4757]: I1216 13:02:11.388649 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:11 crc kubenswrapper[4757]: I1216 13:02:11.436314 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:11 crc kubenswrapper[4757]: I1216 13:02:11.566120 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:13 crc kubenswrapper[4757]: I1216 13:02:13.852049 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btg7w"] Dec 16 13:02:13 crc kubenswrapper[4757]: I1216 13:02:13.852720 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-btg7w" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="registry-server" containerID="cri-o://7b51940f94242a597be3710ae28c955c6f0365a83cf3f9f61b8c9c5028a38820" gracePeriod=2 Dec 16 13:02:14 crc kubenswrapper[4757]: I1216 13:02:14.648998 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerID="7b51940f94242a597be3710ae28c955c6f0365a83cf3f9f61b8c9c5028a38820" exitCode=0 Dec 16 13:02:14 crc kubenswrapper[4757]: I1216 13:02:14.649104 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btg7w" event={"ID":"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8","Type":"ContainerDied","Data":"7b51940f94242a597be3710ae28c955c6f0365a83cf3f9f61b8c9c5028a38820"} Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.671917 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-btg7w" event={"ID":"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8","Type":"ContainerDied","Data":"e7fd5217ce99400757e110820edd2482a85421030c6c4e3a08dcfca9b8b5dee3"} Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.672376 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7fd5217ce99400757e110820edd2482a85421030c6c4e3a08dcfca9b8b5dee3" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.679434 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.770771 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-catalog-content\") pod \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.770822 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h747m\" (UniqueName: \"kubernetes.io/projected/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-kube-api-access-h747m\") pod \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.770854 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-utilities\") pod \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\" (UID: \"c8edb19e-a5a5-4af9-9e0c-6e745ac919e8\") " Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.772087 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-utilities" (OuterVolumeSpecName: "utilities") pod "c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" (UID: "c8edb19e-a5a5-4af9-9e0c-6e745ac919e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.785426 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-kube-api-access-h747m" (OuterVolumeSpecName: "kube-api-access-h747m") pod "c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" (UID: "c8edb19e-a5a5-4af9-9e0c-6e745ac919e8"). InnerVolumeSpecName "kube-api-access-h747m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.820858 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" (UID: "c8edb19e-a5a5-4af9-9e0c-6e745ac919e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.871769 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.871993 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h747m\" (UniqueName: \"kubernetes.io/projected/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-kube-api-access-h747m\") on node \"crc\" DevicePath \"\"" Dec 16 13:02:16 crc kubenswrapper[4757]: I1216 13:02:16.872081 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:02:17 crc kubenswrapper[4757]: I1216 13:02:17.679736 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-btg7w" Dec 16 13:02:17 crc kubenswrapper[4757]: I1216 13:02:17.680174 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" event={"ID":"4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0","Type":"ContainerStarted","Data":"51b35fc63c07a5cb977caafac23e0e58fb29bebffc89b505ae0b7ab746a7c31c"} Dec 16 13:02:17 crc kubenswrapper[4757]: I1216 13:02:17.680235 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" Dec 16 13:02:17 crc kubenswrapper[4757]: I1216 13:02:17.723248 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t" podStartSLOduration=2.675768682 podStartE2EDuration="9.72321146s" podCreationTimestamp="2025-12-16 13:02:08 +0000 UTC" firstStartedPulling="2025-12-16 13:02:09.692102916 +0000 UTC m=+915.119846712" lastFinishedPulling="2025-12-16 13:02:16.739545694 +0000 UTC m=+922.167289490" observedRunningTime="2025-12-16 13:02:17.709849059 +0000 UTC m=+923.137592895" watchObservedRunningTime="2025-12-16 13:02:17.72321146 +0000 UTC m=+923.150955306" Dec 16 13:02:17 crc kubenswrapper[4757]: I1216 13:02:17.738411 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-btg7w"] Dec 16 13:02:17 crc kubenswrapper[4757]: I1216 13:02:17.744635 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-btg7w"] Dec 16 13:02:18 crc kubenswrapper[4757]: I1216 13:02:18.956031 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" path="/var/lib/kubelet/pods/c8edb19e-a5a5-4af9-9e0c-6e745ac919e8/volumes" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.180814 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.180877 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.180927 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.181609 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a23f8e521631b063ae4952d912ce6130192fc2c50ebd364a75b084a90f4b969"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.181682 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" 
containerName="machine-config-daemon" containerID="cri-o://2a23f8e521631b063ae4952d912ce6130192fc2c50ebd364a75b084a90f4b969" gracePeriod=600 Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.448734 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w7bvb"] Dec 16 13:02:21 crc kubenswrapper[4757]: E1216 13:02:21.449480 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="registry-server" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.449505 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="registry-server" Dec 16 13:02:21 crc kubenswrapper[4757]: E1216 13:02:21.449528 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="extract-content" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.449536 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="extract-content" Dec 16 13:02:21 crc kubenswrapper[4757]: E1216 13:02:21.449548 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="extract-utilities" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.449556 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="extract-utilities" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.449734 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8edb19e-a5a5-4af9-9e0c-6e745ac919e8" containerName="registry-server" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.450747 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.467416 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7bvb"] Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.533506 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-utilities\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.533562 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzxxl\" (UniqueName: \"kubernetes.io/projected/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-kube-api-access-fzxxl\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.533672 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-catalog-content\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.635037 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-utilities\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.635100 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzxxl\" (UniqueName: \"kubernetes.io/projected/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-kube-api-access-fzxxl\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.635195 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-catalog-content\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.635835 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-utilities\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.636037 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-catalog-content\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.657866 4757 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fzxxl\" (UniqueName: \"kubernetes.io/projected/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-kube-api-access-fzxxl\") pod \"redhat-marketplace-w7bvb\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") " pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.707951 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="2a23f8e521631b063ae4952d912ce6130192fc2c50ebd364a75b084a90f4b969" exitCode=0 Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.708022 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"2a23f8e521631b063ae4952d912ce6130192fc2c50ebd364a75b084a90f4b969"} Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.708054 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"a3d5810574004acc14ba78a28c621226c4b91fbf94cdc448f59cddf05b4cabae"} Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.708069 4757 scope.go:117] "RemoveContainer" containerID="8a775c60c16076b0ed545742e1f91801e3b31e7877ff7d29827c20b473cfd673" Dec 16 13:02:21 crc kubenswrapper[4757]: I1216 13:02:21.764660 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7bvb" Dec 16 13:02:22 crc kubenswrapper[4757]: I1216 13:02:22.228983 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7bvb"] Dec 16 13:02:22 crc kubenswrapper[4757]: I1216 13:02:22.721632 4757 generic.go:334] "Generic (PLEG): container finished" podID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerID="845dfaa1e4a731db44d601b6342f8988e6d974b313bf72102c8604832625068c" exitCode=0 Dec 16 13:02:22 crc kubenswrapper[4757]: I1216 13:02:22.721723 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7bvb" event={"ID":"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9","Type":"ContainerDied","Data":"845dfaa1e4a731db44d601b6342f8988e6d974b313bf72102c8604832625068c"} Dec 16 13:02:22 crc kubenswrapper[4757]: I1216 13:02:22.721984 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7bvb" event={"ID":"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9","Type":"ContainerStarted","Data":"f9c6dfcacabbf6a4505927a9a8e4a652526ba944ee9c77fcb78fb2d177a0e362"} Dec 16 13:02:24 crc kubenswrapper[4757]: I1216 13:02:24.736579 4757 generic.go:334] "Generic (PLEG): container finished" podID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerID="ab0bcd120d8c8653b56e998153417d44e82615f7b97b110575e90dc058c8af11" exitCode=0 Dec 16 13:02:24 crc kubenswrapper[4757]: I1216 13:02:24.736664 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7bvb" event={"ID":"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9","Type":"ContainerDied","Data":"ab0bcd120d8c8653b56e998153417d44e82615f7b97b110575e90dc058c8af11"} Dec 16 13:02:25 crc kubenswrapper[4757]: I1216 13:02:25.745043 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7bvb" event={"ID":"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9","Type":"ContainerStarted","Data":"9973b1600e3bcf4c366e92778b25499e85f859efe6c2a8a0d7c6d0e9688fc782"} 
Dec 16 13:02:25 crc kubenswrapper[4757]: I1216 13:02:25.779975 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w7bvb" podStartSLOduration=2.312232051 podStartE2EDuration="4.779951905s" podCreationTimestamp="2025-12-16 13:02:21 +0000 UTC" firstStartedPulling="2025-12-16 13:02:22.723846595 +0000 UTC m=+928.151590391" lastFinishedPulling="2025-12-16 13:02:25.191566449 +0000 UTC m=+930.619310245" observedRunningTime="2025-12-16 13:02:25.772822859 +0000 UTC m=+931.200566665" watchObservedRunningTime="2025-12-16 13:02:25.779951905 +0000 UTC m=+931.207695701"
Dec 16 13:02:29 crc kubenswrapper[4757]: I1216 13:02:29.451519 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-56fbb56c9b-wtj5t"
Dec 16 13:02:31 crc kubenswrapper[4757]: I1216 13:02:31.765544 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w7bvb"
Dec 16 13:02:31 crc kubenswrapper[4757]: I1216 13:02:31.765924 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w7bvb"
Dec 16 13:02:31 crc kubenswrapper[4757]: I1216 13:02:31.804812 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w7bvb"
Dec 16 13:02:31 crc kubenswrapper[4757]: I1216 13:02:31.853299 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w7bvb"
Dec 16 13:02:32 crc kubenswrapper[4757]: I1216 13:02:32.042296 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7bvb"]
Dec 16 13:02:33 crc kubenswrapper[4757]: I1216 13:02:33.811307 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w7bvb" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="registry-server" containerID="cri-o://9973b1600e3bcf4c366e92778b25499e85f859efe6c2a8a0d7c6d0e9688fc782" gracePeriod=2
Dec 16 13:02:36 crc kubenswrapper[4757]: E1216 13:02:36.132121 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df475b4_c916_4e7f_be32_5fa6c4dd1cb9.slice/crio-conmon-9973b1600e3bcf4c366e92778b25499e85f859efe6c2a8a0d7c6d0e9688fc782.scope\": RecentStats: unable to find data in memory cache]"
Dec 16 13:02:36 crc kubenswrapper[4757]: I1216 13:02:36.831511 4757 generic.go:334] "Generic (PLEG): container finished" podID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerID="9973b1600e3bcf4c366e92778b25499e85f859efe6c2a8a0d7c6d0e9688fc782" exitCode=0
Dec 16 13:02:36 crc kubenswrapper[4757]: I1216 13:02:36.831563 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7bvb" event={"ID":"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9","Type":"ContainerDied","Data":"9973b1600e3bcf4c366e92778b25499e85f859efe6c2a8a0d7c6d0e9688fc782"}
Dec 16 13:02:37 crc kubenswrapper[4757]: I1216 13:02:37.946173 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7bvb"
Dec 16 13:02:37 crc kubenswrapper[4757]: I1216 13:02:37.976522 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-utilities\") pod \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") "
Dec 16 13:02:37 crc kubenswrapper[4757]: I1216 13:02:37.976598 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-catalog-content\") pod \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") "
Dec 16 13:02:37 crc kubenswrapper[4757]: I1216 13:02:37.976629 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzxxl\" (UniqueName: \"kubernetes.io/projected/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-kube-api-access-fzxxl\") pod \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\" (UID: \"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9\") "
Dec 16 13:02:37 crc kubenswrapper[4757]: I1216 13:02:37.977683 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-utilities" (OuterVolumeSpecName: "utilities") pod "2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" (UID: "2df475b4-c916-4e7f-be32-5fa6c4dd1cb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:37.998695 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-kube-api-access-fzxxl" (OuterVolumeSpecName: "kube-api-access-fzxxl") pod "2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" (UID: "2df475b4-c916-4e7f-be32-5fa6c4dd1cb9"). InnerVolumeSpecName "kube-api-access-fzxxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.009018 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" (UID: "2df475b4-c916-4e7f-be32-5fa6c4dd1cb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.078502 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.078535 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.078547 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzxxl\" (UniqueName: \"kubernetes.io/projected/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9-kube-api-access-fzxxl\") on node \"crc\" DevicePath \"\""
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.844502 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7bvb" event={"ID":"2df475b4-c916-4e7f-be32-5fa6c4dd1cb9","Type":"ContainerDied","Data":"f9c6dfcacabbf6a4505927a9a8e4a652526ba944ee9c77fcb78fb2d177a0e362"}
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.845055 4757 scope.go:117] "RemoveContainer" containerID="9973b1600e3bcf4c366e92778b25499e85f859efe6c2a8a0d7c6d0e9688fc782"
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.844610 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7bvb"
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.864242 4757 scope.go:117] "RemoveContainer" containerID="ab0bcd120d8c8653b56e998153417d44e82615f7b97b110575e90dc058c8af11"
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.891119 4757 scope.go:117] "RemoveContainer" containerID="845dfaa1e4a731db44d601b6342f8988e6d974b313bf72102c8604832625068c"
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.898946 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7bvb"]
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.902986 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7bvb"]
Dec 16 13:02:38 crc kubenswrapper[4757]: I1216 13:02:38.967816 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" path="/var/lib/kubelet/pods/2df475b4-c916-4e7f-be32-5fa6c4dd1cb9/volumes"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.875349 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"]
Dec 16 13:02:45 crc kubenswrapper[4757]: E1216 13:02:45.876215 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="extract-utilities"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.876233 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="extract-utilities"
Dec 16 13:02:45 crc kubenswrapper[4757]: E1216 13:02:45.876249 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="extract-content"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.876257 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="extract-content"
Dec 16 13:02:45 crc kubenswrapper[4757]: E1216 13:02:45.876276 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="registry-server"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.876284 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="registry-server"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.876428 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df475b4-c916-4e7f-be32-5fa6c4dd1cb9" containerName="registry-server"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.876917 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.882027 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-99h4n"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.882980 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.883946 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.887596 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w8l4x"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.893935 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.905213 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.925530 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.926524 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.931542 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-lgxq7"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.935240 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.936346 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.938113 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-j7lkt"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.944511 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.945414 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.954773 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-t4x6b"
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.979933 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"]
Dec 16 13:02:45 crc kubenswrapper[4757]: I1216 13:02:45.994301 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.003410 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbtlm\" (UniqueName: \"kubernetes.io/projected/3717fd56-4339-4ad6-940d-b5023c76d32f-kube-api-access-jbtlm\") pod \"cinder-operator-controller-manager-5f98b4754f-4z9jz\" (UID: \"3717fd56-4339-4ad6-940d-b5023c76d32f\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.003594 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86cpb\" (UniqueName: \"kubernetes.io/projected/34c17eba-d6e6-4399-a0a0-f25ef7a89fb9-kube-api-access-86cpb\") pod \"barbican-operator-controller-manager-95949466-sgvcj\" (UID: \"34c17eba-d6e6-4399-a0a0-f25ef7a89fb9\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.003631 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngl8w\" (UniqueName: \"kubernetes.io/projected/a6449c1f-3695-445d-90b0-64b4c79cde05-kube-api-access-ngl8w\") pod \"designate-operator-controller-manager-66f8b87655-t9vm8\" (UID: \"a6449c1f-3695-445d-90b0-64b4c79cde05\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.003666 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9c64\" (UniqueName: \"kubernetes.io/projected/b46e5138-a221-489d-9d7a-a54cf3938d64-kube-api-access-b9c64\") pod \"heat-operator-controller-manager-59b8dcb766-lvxk6\" (UID: \"b46e5138-a221-489d-9d7a-a54cf3938d64\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.003703 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj8bf\" (UniqueName: \"kubernetes.io/projected/fbd5f746-9483-455c-988e-2e882623d09e-kube-api-access-pj8bf\") pod \"glance-operator-controller-manager-767f9d7567-qtkjq\" (UID: \"fbd5f746-9483-455c-988e-2e882623d09e\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.061021 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.061960 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.066956 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dx9fd"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.075488 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-fw274"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.081770 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.099019 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wgxpn"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.099302 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.117651 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.118525 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbtlm\" (UniqueName: \"kubernetes.io/projected/3717fd56-4339-4ad6-940d-b5023c76d32f-kube-api-access-jbtlm\") pod \"cinder-operator-controller-manager-5f98b4754f-4z9jz\" (UID: \"3717fd56-4339-4ad6-940d-b5023c76d32f\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.118740 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86cpb\" (UniqueName: \"kubernetes.io/projected/34c17eba-d6e6-4399-a0a0-f25ef7a89fb9-kube-api-access-86cpb\") pod \"barbican-operator-controller-manager-95949466-sgvcj\" (UID: \"34c17eba-d6e6-4399-a0a0-f25ef7a89fb9\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.118918 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngl8w\" (UniqueName: \"kubernetes.io/projected/a6449c1f-3695-445d-90b0-64b4c79cde05-kube-api-access-ngl8w\") pod \"designate-operator-controller-manager-66f8b87655-t9vm8\" (UID: \"a6449c1f-3695-445d-90b0-64b4c79cde05\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.119065 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9c64\" (UniqueName: \"kubernetes.io/projected/b46e5138-a221-489d-9d7a-a54cf3938d64-kube-api-access-b9c64\") pod \"heat-operator-controller-manager-59b8dcb766-lvxk6\" (UID: \"b46e5138-a221-489d-9d7a-a54cf3938d64\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.119142 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj8bf\" (UniqueName: \"kubernetes.io/projected/fbd5f746-9483-455c-988e-2e882623d09e-kube-api-access-pj8bf\") pod \"glance-operator-controller-manager-767f9d7567-qtkjq\" (UID: \"fbd5f746-9483-455c-988e-2e882623d09e\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.182526 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.189549 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngl8w\" (UniqueName: \"kubernetes.io/projected/a6449c1f-3695-445d-90b0-64b4c79cde05-kube-api-access-ngl8w\") pod \"designate-operator-controller-manager-66f8b87655-t9vm8\" (UID: \"a6449c1f-3695-445d-90b0-64b4c79cde05\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.206334 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-fw274"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.211825 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.212688 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.219212 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9c64\" (UniqueName: \"kubernetes.io/projected/b46e5138-a221-489d-9d7a-a54cf3938d64-kube-api-access-b9c64\") pod \"heat-operator-controller-manager-59b8dcb766-lvxk6\" (UID: \"b46e5138-a221-489d-9d7a-a54cf3938d64\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.221864 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph"]
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.229669 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86cpb\" (UniqueName: \"kubernetes.io/projected/34c17eba-d6e6-4399-a0a0-f25ef7a89fb9-kube-api-access-86cpb\") pod \"barbican-operator-controller-manager-95949466-sgvcj\" (UID: \"34c17eba-d6e6-4399-a0a0-f25ef7a89fb9\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.230313 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbtlm\" (UniqueName: \"kubernetes.io/projected/3717fd56-4339-4ad6-940d-b5023c76d32f-kube-api-access-jbtlm\") pod \"cinder-operator-controller-manager-5f98b4754f-4z9jz\" (UID: \"3717fd56-4339-4ad6-940d-b5023c76d32f\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.232222 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mr42m"
Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.244951 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj8bf\" (UniqueName: \"kubernetes.io/projected/fbd5f746-9483-455c-988e-2e882623d09e-kube-api-access-pj8bf\") pod \"glance-operator-controller-manager-767f9d7567-qtkjq\" (UID: \"fbd5f746-9483-455c-988e-2e882623d09e\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"
Dec 16 13:02:46 crc
kubenswrapper[4757]: I1216 13:02:46.245228 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.246309 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/6c815add-abbd-4655-b257-d50ab074414a-kube-api-access-m6qrh\") pod \"horizon-operator-controller-manager-6ccf486b9-w9gps\" (UID: \"6c815add-abbd-4655-b257-d50ab074414a\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.246349 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9j68\" (UniqueName: \"kubernetes.io/projected/904525e7-6f82-4fbf-928a-99194a97829a-kube-api-access-v9j68\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.246377 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.265645 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.277477 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.305107 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.305846 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.306313 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.306411 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.310954 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.312143 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.325672 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.337623 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vn7k7" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.337790 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-j2cb9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.338237 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-cz7k9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.338654 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.338727 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.353084 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.353851 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.354499 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.355123 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.358356 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26hws\" (UniqueName: \"kubernetes.io/projected/6333c537-0505-48c0-b197-a609084a2a2c-kube-api-access-26hws\") pod \"ironic-operator-controller-manager-f458558d7-45bgz\" (UID: \"6333c537-0505-48c0-b197-a609084a2a2c\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.358504 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/6c815add-abbd-4655-b257-d50ab074414a-kube-api-access-m6qrh\") pod \"horizon-operator-controller-manager-6ccf486b9-w9gps\" (UID: \"6c815add-abbd-4655-b257-d50ab074414a\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.358566 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9j68\" (UniqueName: \"kubernetes.io/projected/904525e7-6f82-4fbf-928a-99194a97829a-kube-api-access-v9j68\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.358607 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:46 crc kubenswrapper[4757]: E1216 13:02:46.358931 4757 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:46 crc kubenswrapper[4757]: E1216 13:02:46.359031 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert podName:904525e7-6f82-4fbf-928a-99194a97829a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:46.858983863 +0000 UTC m=+952.286727659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert") pod "infra-operator-controller-manager-84b495f78-fw274" (UID: "904525e7-6f82-4fbf-928a-99194a97829a") : secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.362432 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.363221 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.366994 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.370981 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.371354 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wspgs" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.374059 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6h6lj" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.376445 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.379207 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-scprn" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.382991 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.386727 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.389940 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.390679 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.402551 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.403603 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-h2znj" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.403834 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.420388 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9j68\" (UniqueName: \"kubernetes.io/projected/904525e7-6f82-4fbf-928a-99194a97829a-kube-api-access-v9j68\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.427702 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8dr49" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.451809 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qrh\" (UniqueName: \"kubernetes.io/projected/6c815add-abbd-4655-b257-d50ab074414a-kube-api-access-m6qrh\") pod \"horizon-operator-controller-manager-6ccf486b9-w9gps\" (UID: \"6c815add-abbd-4655-b257-d50ab074414a\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.464772 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdqcs\" (UniqueName: \"kubernetes.io/projected/545806dc-d916-4704-bc27-f5a46915fb56-kube-api-access-wdqcs\") pod \"neutron-operator-controller-manager-7cd87b778f-v7jvg\" (UID: \"545806dc-d916-4704-bc27-f5a46915fb56\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.464847 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26hws\" (UniqueName: \"kubernetes.io/projected/6333c537-0505-48c0-b197-a609084a2a2c-kube-api-access-26hws\") pod \"ironic-operator-controller-manager-f458558d7-45bgz\" (UID: \"6333c537-0505-48c0-b197-a609084a2a2c\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.464885 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdv2q\" (UniqueName: \"kubernetes.io/projected/eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0-kube-api-access-pdv2q\") pod \"manila-operator-controller-manager-5fdd9786f7-wg8wq\" (UID: \"eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.464921 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7wk6\" (UniqueName: \"kubernetes.io/projected/e9f15431-d8cd-408d-8169-e06457cabccc-kube-api-access-x7wk6\") pod 
\"octavia-operator-controller-manager-68c649d9d-dxgr9\" (UID: \"e9f15431-d8cd-408d-8169-e06457cabccc\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.464968 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxtf\" (UniqueName: \"kubernetes.io/projected/75d829d5-a3cd-48c6-8aff-07f7d325b4f9-kube-api-access-4bxtf\") pod \"mariadb-operator-controller-manager-f76f4954c-psxvw\" (UID: \"75d829d5-a3cd-48c6-8aff-07f7d325b4f9\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.465023 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs9rw\" (UniqueName: \"kubernetes.io/projected/72a6aea3-2309-4c98-802b-416feed1ba0f-kube-api-access-qs9rw\") pod \"keystone-operator-controller-manager-5c7cbf548f-wgrph\" (UID: \"72a6aea3-2309-4c98-802b-416feed1ba0f\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.465065 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skq8\" (UniqueName: \"kubernetes.io/projected/83154b06-c2df-4a44-9a33-4971cd60add3-kube-api-access-2skq8\") pod \"nova-operator-controller-manager-5fbbf8b6cc-bqfgm\" (UID: \"83154b06-c2df-4a44-9a33-4971cd60add3\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.467415 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.468518 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.473492 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4fvjn" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.506168 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26hws\" (UniqueName: \"kubernetes.io/projected/6333c537-0505-48c0-b197-a609084a2a2c-kube-api-access-26hws\") pod \"ironic-operator-controller-manager-f458558d7-45bgz\" (UID: \"6333c537-0505-48c0-b197-a609084a2a2c\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.508132 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.508574 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.524485 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.545664 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.561529 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.562295 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570090 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdv2q\" (UniqueName: \"kubernetes.io/projected/eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0-kube-api-access-pdv2q\") pod \"manila-operator-controller-manager-5fdd9786f7-wg8wq\" (UID: \"eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570152 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7wk6\" (UniqueName: \"kubernetes.io/projected/e9f15431-d8cd-408d-8169-e06457cabccc-kube-api-access-x7wk6\") pod \"octavia-operator-controller-manager-68c649d9d-dxgr9\" (UID: \"e9f15431-d8cd-408d-8169-e06457cabccc\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570207 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570243 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxtf\" (UniqueName: \"kubernetes.io/projected/75d829d5-a3cd-48c6-8aff-07f7d325b4f9-kube-api-access-4bxtf\") pod \"mariadb-operator-controller-manager-f76f4954c-psxvw\" (UID: \"75d829d5-a3cd-48c6-8aff-07f7d325b4f9\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570264 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jwf\" (UniqueName: \"kubernetes.io/projected/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-kube-api-access-62jwf\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570289 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwdw8\" (UniqueName: \"kubernetes.io/projected/28ec7b61-2e0c-4ad7-8569-eeb5973b976d-kube-api-access-rwdw8\") pod \"placement-operator-controller-manager-8665b56d78-bk25g\" (UID: \"28ec7b61-2e0c-4ad7-8569-eeb5973b976d\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" Dec 16 
13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570323 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs9rw\" (UniqueName: \"kubernetes.io/projected/72a6aea3-2309-4c98-802b-416feed1ba0f-kube-api-access-qs9rw\") pod \"keystone-operator-controller-manager-5c7cbf548f-wgrph\" (UID: \"72a6aea3-2309-4c98-802b-416feed1ba0f\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570353 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl4qj\" (UniqueName: \"kubernetes.io/projected/60821702-232d-4eb4-b70f-15e87e070aed-kube-api-access-rl4qj\") pod \"ovn-operator-controller-manager-bf6d4f946-x72bf\" (UID: \"60821702-232d-4eb4-b70f-15e87e070aed\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570388 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skq8\" (UniqueName: \"kubernetes.io/projected/83154b06-c2df-4a44-9a33-4971cd60add3-kube-api-access-2skq8\") pod \"nova-operator-controller-manager-5fbbf8b6cc-bqfgm\" (UID: \"83154b06-c2df-4a44-9a33-4971cd60add3\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.570431 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdqcs\" (UniqueName: \"kubernetes.io/projected/545806dc-d916-4704-bc27-f5a46915fb56-kube-api-access-wdqcs\") pod \"neutron-operator-controller-manager-7cd87b778f-v7jvg\" (UID: \"545806dc-d916-4704-bc27-f5a46915fb56\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.573655 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-b5vjx" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.575600 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.582101 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.608392 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jwdmh" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.620948 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7wk6\" (UniqueName: \"kubernetes.io/projected/e9f15431-d8cd-408d-8169-e06457cabccc-kube-api-access-x7wk6\") pod \"octavia-operator-controller-manager-68c649d9d-dxgr9\" (UID: \"e9f15431-d8cd-408d-8169-e06457cabccc\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.622859 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.626238 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxtf\" (UniqueName: \"kubernetes.io/projected/75d829d5-a3cd-48c6-8aff-07f7d325b4f9-kube-api-access-4bxtf\") pod \"mariadb-operator-controller-manager-f76f4954c-psxvw\" (UID: \"75d829d5-a3cd-48c6-8aff-07f7d325b4f9\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.633699 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs9rw\" (UniqueName: \"kubernetes.io/projected/72a6aea3-2309-4c98-802b-416feed1ba0f-kube-api-access-qs9rw\") pod \"keystone-operator-controller-manager-5c7cbf548f-wgrph\" (UID: \"72a6aea3-2309-4c98-802b-416feed1ba0f\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.646830 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skq8\" (UniqueName: \"kubernetes.io/projected/83154b06-c2df-4a44-9a33-4971cd60add3-kube-api-access-2skq8\") pod \"nova-operator-controller-manager-5fbbf8b6cc-bqfgm\" (UID: \"83154b06-c2df-4a44-9a33-4971cd60add3\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.647806 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx"] Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.869714 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.887310 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.899202 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.902437 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.923869 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdv2q\" (UniqueName: \"kubernetes.io/projected/eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0-kube-api-access-pdv2q\") pod \"manila-operator-controller-manager-5fdd9786f7-wg8wq\" (UID: \"eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" Dec 16 13:02:46 crc kubenswrapper[4757]: I1216 13:02:46.924600 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdqcs\" (UniqueName: \"kubernetes.io/projected/545806dc-d916-4704-bc27-f5a46915fb56-kube-api-access-wdqcs\") pod \"neutron-operator-controller-manager-7cd87b778f-v7jvg\" (UID: \"545806dc-d916-4704-bc27-f5a46915fb56\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.025333 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.026897 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.027864 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl4qj\" (UniqueName: \"kubernetes.io/projected/60821702-232d-4eb4-b70f-15e87e070aed-kube-api-access-rl4qj\") pod \"ovn-operator-controller-manager-bf6d4f946-x72bf\" (UID: \"60821702-232d-4eb4-b70f-15e87e070aed\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.028033 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slwp5\" (UniqueName: \"kubernetes.io/projected/120aab20-c2fb-441d-9c07-bd05c0678a11-kube-api-access-slwp5\") pod \"swift-operator-controller-manager-5c6df8f9-wgxxx\" (UID: \"120aab20-c2fb-441d-9c07-bd05c0678a11\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.028086 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.028125 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.028154 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jwf\" (UniqueName: \"kubernetes.io/projected/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-kube-api-access-62jwf\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.028185 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwdw8\" (UniqueName: \"kubernetes.io/projected/28ec7b61-2e0c-4ad7-8569-eeb5973b976d-kube-api-access-rwdw8\") pod \"placement-operator-controller-manager-8665b56d78-bk25g\" (UID: \"28ec7b61-2e0c-4ad7-8569-eeb5973b976d\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.028698 4757 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.028751 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert podName:67c1a6c7-35d1-48a9-a058-13e5d5599fe7 nodeName:}" failed. 
No retries permitted until 2025-12-16 13:02:47.528733528 +0000 UTC m=+952.956477324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" (UID: "67c1a6c7-35d1-48a9-a058-13e5d5599fe7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.031623 4757 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.080223 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert podName:904525e7-6f82-4fbf-928a-99194a97829a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:48.031685811 +0000 UTC m=+953.459429617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert") pod "infra-operator-controller-manager-84b495f78-fw274" (UID: "904525e7-6f82-4fbf-928a-99194a97829a") : secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.080660 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.104241 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwdw8\" (UniqueName: \"kubernetes.io/projected/28ec7b61-2e0c-4ad7-8569-eeb5973b976d-kube-api-access-rwdw8\") pod \"placement-operator-controller-manager-8665b56d78-bk25g\" (UID: \"28ec7b61-2e0c-4ad7-8569-eeb5973b976d\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.108617 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl4qj\" (UniqueName: \"kubernetes.io/projected/60821702-232d-4eb4-b70f-15e87e070aed-kube-api-access-rl4qj\") pod \"ovn-operator-controller-manager-bf6d4f946-x72bf\" (UID: \"60821702-232d-4eb4-b70f-15e87e070aed\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.131899 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jwf\" (UniqueName: \"kubernetes.io/projected/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-kube-api-access-62jwf\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.132069 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slwp5\" (UniqueName: \"kubernetes.io/projected/120aab20-c2fb-441d-9c07-bd05c0678a11-kube-api-access-slwp5\") pod \"swift-operator-controller-manager-5c6df8f9-wgxxx\" (UID: \"120aab20-c2fb-441d-9c07-bd05c0678a11\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.133521 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2jh\" (UniqueName: 
\"kubernetes.io/projected/ac2d53dd-c297-44b1-bcb1-a3025530eb5c-kube-api-access-tf2jh\") pod \"telemetry-operator-controller-manager-97d456b9-85zkh\" (UID: \"ac2d53dd-c297-44b1-bcb1-a3025530eb5c\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.158653 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slwp5\" (UniqueName: \"kubernetes.io/projected/120aab20-c2fb-441d-9c07-bd05c0678a11-kube-api-access-slwp5\") pod \"swift-operator-controller-manager-5c6df8f9-wgxxx\" (UID: \"120aab20-c2fb-441d-9c07-bd05c0678a11\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.175338 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.209348 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.226602 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.227752 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.260734 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.261706 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2jh\" (UniqueName: \"kubernetes.io/projected/ac2d53dd-c297-44b1-bcb1-a3025530eb5c-kube-api-access-tf2jh\") pod \"telemetry-operator-controller-manager-97d456b9-85zkh\" (UID: \"ac2d53dd-c297-44b1-bcb1-a3025530eb5c\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.270082 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.271223 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.288206 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.288594 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.288647 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.289546 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.305363 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.311117 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.311956 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.332146 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.332719 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9dxq5" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.332884 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-d6tmx" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.333360 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.333657 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.333865 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hwbjr" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.334023 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-j586v" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.352141 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.352809 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2jh\" (UniqueName: \"kubernetes.io/projected/ac2d53dd-c297-44b1-bcb1-a3025530eb5c-kube-api-access-tf2jh\") pod \"telemetry-operator-controller-manager-97d456b9-85zkh\" (UID: \"ac2d53dd-c297-44b1-bcb1-a3025530eb5c\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.368165 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zd72\" (UniqueName: \"kubernetes.io/projected/ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89-kube-api-access-9zd72\") pod \"watcher-operator-controller-manager-55f78b7c4c-dr2qv\" (UID: \"ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.368223 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74g2v\" (UniqueName: \"kubernetes.io/projected/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-kube-api-access-74g2v\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.368321 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnspl\" (UniqueName: \"kubernetes.io/projected/42d952f0-a650-484d-9e6b-b1c6c0f252dc-kube-api-access-gnspl\") pod \"test-operator-controller-manager-756ccf86c7-w6kx8\" (UID: \"42d952f0-a650-484d-9e6b-b1c6c0f252dc\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.368343 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.368379 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.575510 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.575876 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.575994 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zd72\" (UniqueName: \"kubernetes.io/projected/ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89-kube-api-access-9zd72\") pod \"watcher-operator-controller-manager-55f78b7c4c-dr2qv\" (UID: \"ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.576121 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88f8x\" (UniqueName: \"kubernetes.io/projected/7432087f-983f-4b3d-af98-40238ceba951-kube-api-access-88f8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gcxgg\" (UID: \"7432087f-983f-4b3d-af98-40238ceba951\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.576228 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74g2v\" (UniqueName: \"kubernetes.io/projected/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-kube-api-access-74g2v\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.576402 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnspl\" (UniqueName: \"kubernetes.io/projected/42d952f0-a650-484d-9e6b-b1c6c0f252dc-kube-api-access-gnspl\") pod \"test-operator-controller-manager-756ccf86c7-w6kx8\" (UID: \"42d952f0-a650-484d-9e6b-b1c6c0f252dc\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.576506 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.575809 4757 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.576761 4757 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.576809 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:48.076792249 +0000 UTC m=+953.504536045 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.577030 4757 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.577096 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert podName:67c1a6c7-35d1-48a9-a058-13e5d5599fe7 nodeName:}" failed. No retries permitted until 2025-12-16 13:02:48.577055385 +0000 UTC m=+954.004799181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" (UID: "67c1a6c7-35d1-48a9-a058-13e5d5599fe7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: E1216 13:02:47.577129 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:48.077120607 +0000 UTC m=+953.504864403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "metrics-server-cert" not found Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.610071 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnspl\" (UniqueName: \"kubernetes.io/projected/42d952f0-a650-484d-9e6b-b1c6c0f252dc-kube-api-access-gnspl\") pod \"test-operator-controller-manager-756ccf86c7-w6kx8\" (UID: \"42d952f0-a650-484d-9e6b-b1c6c0f252dc\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.611434 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74g2v\" (UniqueName: \"kubernetes.io/projected/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-kube-api-access-74g2v\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.695915 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zd72\" (UniqueName: \"kubernetes.io/projected/ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89-kube-api-access-9zd72\") pod \"watcher-operator-controller-manager-55f78b7c4c-dr2qv\" (UID: \"ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.700340 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88f8x\" (UniqueName: \"kubernetes.io/projected/7432087f-983f-4b3d-af98-40238ceba951-kube-api-access-88f8x\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-gcxgg\" (UID: \"7432087f-983f-4b3d-af98-40238ceba951\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.804964 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.864352 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.881107 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.939994 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"] Dec 16 13:02:47 crc kubenswrapper[4757]: I1216 13:02:47.968470 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88f8x\" (UniqueName: \"kubernetes.io/projected/7432087f-983f-4b3d-af98-40238ceba951-kube-api-access-88f8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gcxgg\" (UID: \"7432087f-983f-4b3d-af98-40238ceba951\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" Dec 16 13:02:48 crc kubenswrapper[4757]: I1216 13:02:48.119134 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:48 crc kubenswrapper[4757]: I1216 13:02:48.119209 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:48 crc kubenswrapper[4757]: I1216 13:02:48.119260 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:48 crc kubenswrapper[4757]: E1216 13:02:48.120134 4757 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:48 crc kubenswrapper[4757]: E1216 13:02:48.120177 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert podName:904525e7-6f82-4fbf-928a-99194a97829a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:50.120162784 +0000 UTC m=+955.547906580 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert") pod "infra-operator-controller-manager-84b495f78-fw274" (UID: "904525e7-6f82-4fbf-928a-99194a97829a") : secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:48 crc kubenswrapper[4757]: E1216 13:02:48.120260 4757 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 13:02:48 crc kubenswrapper[4757]: E1216 13:02:48.120313 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:49.120297247 +0000 UTC m=+954.548041043 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "webhook-server-cert" not found Dec 16 13:02:48 crc kubenswrapper[4757]: E1216 13:02:48.120580 4757 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 13:02:48 crc kubenswrapper[4757]: E1216 13:02:48.120648 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:49.120626265 +0000 UTC m=+954.548370121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "metrics-server-cert" not found Dec 16 13:02:48 crc kubenswrapper[4757]: I1216 13:02:48.235617 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.010950 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:49 crc kubenswrapper[4757]: E1216 13:02:49.011178 4757 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:49 crc kubenswrapper[4757]: E1216 13:02:49.011221 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert podName:67c1a6c7-35d1-48a9-a058-13e5d5599fe7 nodeName:}" failed. No retries permitted until 2025-12-16 13:02:51.011207902 +0000 UTC m=+956.438951698 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" (UID: "67c1a6c7-35d1-48a9-a058-13e5d5599fe7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.219930 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.220033 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:49 crc kubenswrapper[4757]: E1216 13:02:49.226668 4757 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 13:02:49 crc kubenswrapper[4757]: E1216 13:02:49.226743 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:51.226721773 +0000 UTC m=+956.654465569 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "metrics-server-cert" not found Dec 16 13:02:49 crc kubenswrapper[4757]: E1216 13:02:49.235623 4757 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
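
Note that the kubelet tracks each volume as its own pending operation keyed by the volume's unique name, which is why webhook-certs and metrics-certs on the same pod (7b6693c4-...) fail, back off, and retry on independent clocks, and why each reconciler pass logs both errors separately. A toy model of that bookkeeping, mirroring the "No retries permitted until ..." lines above; this is a simplified sketch, not the kubelet's nestedpendingoperations implementation, and the roughly-two-minute cap is an assumption not visible in this log:

    # One backoff deadline per volume key, doubled on each failure.
    import time

    class PendingOps:
        def __init__(self, initial=0.5, cap=122.0):  # cap ~2m2s: assumption
            self.initial, self.cap = initial, cap
            self.delay, self.not_before = {}, {}

        def try_run(self, key, op):
            now = time.monotonic()
            if now < self.not_before.get(key, 0.0):
                return False  # "No retries permitted until ..."
            try:
                op()
            except Exception:
                d = min(self.delay.get(key, self.initial / 2) * 2, self.cap)
                self.delay[key] = d                  # durationBeforeRetry
                self.not_before[key] = now + d
                return False
            self.delay.pop(key, None)
            self.not_before.pop(key, None)
            return True

    def failing_mount():
        raise RuntimeError('secret "webhook-server-cert" not found')

    ops = PendingOps()
    ops.try_run("webhook-certs", failing_mount)  # schedules retry in 0.5s
    print(ops.delay["webhook-certs"])            # 0.5

One missing secret therefore never delays the retry clock of any other volume on the same pod.
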
Dec 16 13:02:49 crc kubenswrapper[4757]: E1216 13:02:49.236110 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:51.235813078 +0000 UTC m=+956.663556874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "webhook-server-cert" not found Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.259363 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq" event={"ID":"fbd5f746-9483-455c-988e-2e882623d09e","Type":"ContainerStarted","Data":"f5bffae6971d4456911d3162246a9c865364195955ee31d7c173efcbdf6797cb"} Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.290060 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"] Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.315331 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"] Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.384956 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"] Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.398426 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"] Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.402335 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"] Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.458786 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm"] Dec 16 13:02:49 crc kubenswrapper[4757]: W1216 13:02:49.470153 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb46e5138_a221_489d_9d7a_a54cf3938d64.slice/crio-c148df39be921112fe1dda9b62ad2dbfbaf57cccd9c9db0c60bf66a79e738e33 WatchSource:0}: Error finding container c148df39be921112fe1dda9b62ad2dbfbaf57cccd9c9db0c60bf66a79e738e33: Status 404 returned error can't find the container with id c148df39be921112fe1dda9b62ad2dbfbaf57cccd9c9db0c60bf66a79e738e33 Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.523271 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw"] Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.830482 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph"] Dec 16 13:02:49 crc kubenswrapper[4757]: W1216 13:02:49.838800 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a6aea3_2309_4c98_802b_416feed1ba0f.slice/crio-8421b3f9c7ee251d5e1cfd490e25fd9a4d836e6d18f66defa0c1f7d2d956d876 WatchSource:0}: Error finding container 8421b3f9c7ee251d5e1cfd490e25fd9a4d836e6d18f66defa0c1f7d2d956d876: Status 404 returned error can't find the container with id 8421b3f9c7ee251d5e1cfd490e25fd9a4d836e6d18f66defa0c1f7d2d956d876 Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.936104 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps"] Dec 16 13:02:49 crc kubenswrapper[4757]: W1216
13:02:49.951646 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c815add_abbd_4655_b257_d50ab074414a.slice/crio-63c16fa0645ffccc2ee4ab61818cd694f6743732d5a40b311a2a618cdc4c4541 WatchSource:0}: Error finding container 63c16fa0645ffccc2ee4ab61818cd694f6743732d5a40b311a2a618cdc4c4541: Status 404 returned error can't find the container with id 63c16fa0645ffccc2ee4ab61818cd694f6743732d5a40b311a2a618cdc4c4541 Dec 16 13:02:49 crc kubenswrapper[4757]: I1216 13:02:49.964359 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf"] Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.028187 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq"] Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.036546 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"] Dec 16 13:02:50 crc kubenswrapper[4757]: W1216 13:02:50.048975 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb0b72d5_d126_4513_8fbb_c7eed0a8f5e0.slice/crio-a2376b4acf37db570439c0fde3020652952e188368e1d68a927897baa58d3195 WatchSource:0}: Error finding container a2376b4acf37db570439c0fde3020652952e188368e1d68a927897baa58d3195: Status 404 returned error can't find the container with id a2376b4acf37db570439c0fde3020652952e188368e1d68a927897baa58d3195 Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.074417 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx"] Dec 16 13:02:50 crc kubenswrapper[4757]: W1216 13:02:50.093882 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod120aab20_c2fb_441d_9c07_bd05c0678a11.slice/crio-3e52aec48a707ba91d65de044c4ff9eda25942ce9fedbc15534afaf9617950ce WatchSource:0}: Error finding container 3e52aec48a707ba91d65de044c4ff9eda25942ce9fedbc15534afaf9617950ce: Status 404 returned error can't find the container with id 3e52aec48a707ba91d65de044c4ff9eda25942ce9fedbc15534afaf9617950ce Dec 16 13:02:50 crc kubenswrapper[4757]: W1216 13:02:50.094672 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7432087f_983f_4b3d_af98_40238ceba951.slice/crio-a8b3124c362ca7dccfabc53556e192c4d9ac79092e0f4466e473bf610d453aee WatchSource:0}: Error finding container a8b3124c362ca7dccfabc53556e192c4d9ac79092e0f4466e473bf610d453aee: Status 404 returned error can't find the container with id a8b3124c362ca7dccfabc53556e192c4d9ac79092e0f4466e473bf610d453aee Dec 16 13:02:50 crc kubenswrapper[4757]: W1216 13:02:50.097166 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee4f2b54_1f7b_469d_9d41_ad4d57c3bf89.slice/crio-97a99da79176befff2cdc0d0098d38d9c3db8e1996441289a6237e9c81fbb443 WatchSource:0}: Error finding container 97a99da79176befff2cdc0d0098d38d9c3db8e1996441289a6237e9c81fbb443: Status 404 returned error can't find the container with id 97a99da79176befff2cdc0d0098d38d9c3db8e1996441289a6237e9c81fbb443 Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.097204 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg"] Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.101126 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-88f8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gcxgg_openstack-operators(7432087f-983f-4b3d-af98-40238ceba951): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.102357 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" podUID="7432087f-983f-4b3d-af98-40238ceba951" Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.104560 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slwp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-wgxxx_openstack-operators(120aab20-c2fb-441d-9c07-bd05c0678a11): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.106409 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" podUID="120aab20-c2fb-441d-9c07-bd05c0678a11" Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.109570 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv"] Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.149061 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.149377 4757 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
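
The ErrImagePull: "pull QPS exceeded" failures above are generated on the node, not by the registry: the kubelet rate-limits its own image pulls (the registryPullQPS and registryBurst fields of KubeletConfiguration, whose documented defaults are 5 pulls/s with a burst of 10), and a cold node starting a few dozen operator pods at once exhausts the burst, so the trailing pulls are refused locally and those pods drop into ImagePullBackOff. A token-bucket illustration of the effect, with the defaults as assumptions; this sketches the behavior, it is not the kubelet's implementation:

    # Token-bucket view of the kubelet's client-side pull limit.
    # registryPullQPS=5 and registryBurst=10 are documented defaults.
    def simulate_burst(pulls, burst=10.0):
        tokens, rejected = burst, []
        for i in range(pulls):            # all pulls arrive at once (t=0)
            if tokens >= 1.0:
                tokens -= 1.0
            else:
                rejected.append(i)        # -> "pull QPS exceeded"
        return rejected

    print(simulate_burst(20))  # pulls 10..19 are refused until tokens refill

The &Container{...} dumps wrapped around these errors print resource.Quantity values in their internal form: {{500 -3} {} 500m DecimalSI} is 500 * 10^-3 cores (500m), and {{536870912 0} {} BinarySI} is 536870912 bytes (512Mi):

    # Decode the two quantity encodings seen in the container dumps.
    print(500 * 10**-3)       # 0.5 CPU   ({{500 -3} {} 500m DecimalSI})
    print(536870912 / 2**20)  # 512.0 Mi  ({{536870912 0} {} BinarySI})
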
Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.149637 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert podName:904525e7-6f82-4fbf-928a-99194a97829a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:54.149547606 +0000 UTC m=+959.577291402 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert") pod "infra-operator-controller-manager-84b495f78-fw274" (UID: "904525e7-6f82-4fbf-928a-99194a97829a") : secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.245143 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh"] Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.266709 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8"] Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.267902 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdqcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-v7jvg_openstack-operators(545806dc-d916-4704-bc27-f5a46915fb56): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.269067 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg"
podUID="545806dc-d916-4704-bc27-f5a46915fb56" Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.273944 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" event={"ID":"545806dc-d916-4704-bc27-f5a46915fb56","Type":"ContainerStarted","Data":"90f12b0e8e56d532525e8013e0a0943da762548c2bcfed715d4b2ee5ce6348fe"} Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.275725 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" podUID="545806dc-d916-4704-bc27-f5a46915fb56" Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.276872 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" event={"ID":"6c815add-abbd-4655-b257-d50ab074414a","Type":"ContainerStarted","Data":"63c16fa0645ffccc2ee4ab61818cd694f6743732d5a40b311a2a618cdc4c4541"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.286687 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg"] Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.295393 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g"] Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.302531 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" event={"ID":"75d829d5-a3cd-48c6-8aff-07f7d325b4f9","Type":"ContainerStarted","Data":"a38308ef9a5c07aac11dfb2a537096cf094dbf29c397c52709052390045298bf"} Dec 16 13:02:50 crc kubenswrapper[4757]: W1216 13:02:50.307228 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ec7b61_2e0c_4ad7_8569_eeb5973b976d.slice/crio-a7f5c8c781f8e36218e895abf20f2a0841a698d03c60922cdc8aea5dfeb1db5b WatchSource:0}: Error finding container a7f5c8c781f8e36218e895abf20f2a0841a698d03c60922cdc8aea5dfeb1db5b: Status 404 returned error can't find the container with id a7f5c8c781f8e36218e895abf20f2a0841a698d03c60922cdc8aea5dfeb1db5b Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.307568 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gnspl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-w6kx8_openstack-operators(42d952f0-a650-484d-9e6b-b1c6c0f252dc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.309739 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.341333 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" event={"ID":"72a6aea3-2309-4c98-802b-416feed1ba0f","Type":"ContainerStarted","Data":"8421b3f9c7ee251d5e1cfd490e25fd9a4d836e6d18f66defa0c1f7d2d956d876"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.372033 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" event={"ID":"eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0","Type":"ContainerStarted","Data":"a2376b4acf37db570439c0fde3020652952e188368e1d68a927897baa58d3195"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.373634 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" event={"ID":"ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89","Type":"ContainerStarted","Data":"97a99da79176befff2cdc0d0098d38d9c3db8e1996441289a6237e9c81fbb443"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.374848 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" event={"ID":"a6449c1f-3695-445d-90b0-64b4c79cde05","Type":"ContainerStarted","Data":"83c1aa06390e11698cd7daffa5b6593a3d7635b2bc39e20dfe3b39036dfb4ed6"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.375780 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" event={"ID":"e9f15431-d8cd-408d-8169-e06457cabccc","Type":"ContainerStarted","Data":"cbe357cf38370129af219ac464bc1f5586b97149720139dde8c88c6343bb920c"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.403598 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6" event={"ID":"b46e5138-a221-489d-9d7a-a54cf3938d64","Type":"ContainerStarted","Data":"c148df39be921112fe1dda9b62ad2dbfbaf57cccd9c9db0c60bf66a79e738e33"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.407249 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" event={"ID":"ac2d53dd-c297-44b1-bcb1-a3025530eb5c","Type":"ContainerStarted","Data":"a2bebad8caed42f9dd07b729335c8b40930dfde0626761dfb614407ea990f8c2"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.416782 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" event={"ID":"60821702-232d-4eb4-b70f-15e87e070aed","Type":"ContainerStarted","Data":"33ea5733819bf619cd27f6df6375bcddb7f33956a465164002b2004f6a55a08b"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.420478 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" event={"ID":"7432087f-983f-4b3d-af98-40238ceba951","Type":"ContainerStarted","Data":"a8b3124c362ca7dccfabc53556e192c4d9ac79092e0f4466e473bf610d453aee"} Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.437494 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" podUID="7432087f-983f-4b3d-af98-40238ceba951" Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.442318 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" event={"ID":"3717fd56-4339-4ad6-940d-b5023c76d32f","Type":"ContainerStarted","Data":"6369949c641a4c361af17648542bc1c7749f3e0418e1328caf9eadaec96f4d66"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.456400 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" event={"ID":"6333c537-0505-48c0-b197-a609084a2a2c","Type":"ContainerStarted","Data":"a48c6e5c6af1e9f9a427697668cb98a1f7ba55e739f51b17d38f7f26321ffd90"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.487045 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" event={"ID":"120aab20-c2fb-441d-9c07-bd05c0678a11","Type":"ContainerStarted","Data":"3e52aec48a707ba91d65de044c4ff9eda25942ce9fedbc15534afaf9617950ce"} Dec 16 13:02:50 crc kubenswrapper[4757]: E1216 13:02:50.500257 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" podUID="120aab20-c2fb-441d-9c07-bd05c0678a11" Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.517960 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj" event={"ID":"34c17eba-d6e6-4399-a0a0-f25ef7a89fb9","Type":"ContainerStarted","Data":"0205f36768ada274dd002a0765e4a34a9740384650569aa02a2c4dc0069b16ce"} Dec 16 13:02:50 crc kubenswrapper[4757]: I1216 13:02:50.535166 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" event={"ID":"83154b06-c2df-4a44-9a33-4971cd60add3","Type":"ContainerStarted","Data":"ad655b18352456f5b58ee24690d2aed4fbc62ccf574a0839f7643ab05b71e168"} Dec 16 13:02:51 crc kubenswrapper[4757]: I1216 13:02:51.063241 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.063408 4757 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.063458 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert podName:67c1a6c7-35d1-48a9-a058-13e5d5599fe7 nodeName:}" failed. No retries permitted until 2025-12-16 13:02:55.063440798 +0000 UTC m=+960.491184594 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" (UID: "67c1a6c7-35d1-48a9-a058-13e5d5599fe7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:51 crc kubenswrapper[4757]: I1216 13:02:51.266684 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:51 crc kubenswrapper[4757]: I1216 13:02:51.266760 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.266884 4757 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.266907 4757 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.266962 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:55.266940302 +0000 UTC m=+960.694684188 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "webhook-server-cert" not found Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.266998 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:02:55.266974453 +0000 UTC m=+960.694718369 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "metrics-server-cert" not found Dec 16 13:02:51 crc kubenswrapper[4757]: I1216 13:02:51.603732 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" event={"ID":"42d952f0-a650-484d-9e6b-b1c6c0f252dc","Type":"ContainerStarted","Data":"04606d9d2debd8b4875f6dd62c27ced1f01d3e2189dedd411231bdfb8503c561"} Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.609054 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" Dec 16 13:02:51 crc kubenswrapper[4757]: I1216 13:02:51.615308 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" event={"ID":"28ec7b61-2e0c-4ad7-8569-eeb5973b976d","Type":"ContainerStarted","Data":"a7f5c8c781f8e36218e895abf20f2a0841a698d03c60922cdc8aea5dfeb1db5b"} Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.618450 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" podUID="120aab20-c2fb-441d-9c07-bd05c0678a11" Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.618528 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" podUID="7432087f-983f-4b3d-af98-40238ceba951" Dec 16 13:02:51 crc kubenswrapper[4757]: E1216 13:02:51.622807 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" podUID="545806dc-d916-4704-bc27-f5a46915fb56" Dec 16 13:02:52 crc kubenswrapper[4757]: E1216 13:02:52.701374 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" Dec 16 13:02:52 crc kubenswrapper[4757]: E1216 13:02:52.701553 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" podUID="545806dc-d916-4704-bc27-f5a46915fb56" Dec 16 13:02:54 crc kubenswrapper[4757]: I1216 13:02:54.221642 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:02:54 crc kubenswrapper[4757]: E1216 13:02:54.221851 4757 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:54 crc kubenswrapper[4757]: E1216 13:02:54.222126 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert podName:904525e7-6f82-4fbf-928a-99194a97829a nodeName:}" failed. No retries permitted until 2025-12-16 13:03:02.22209246 +0000 UTC m=+967.649836256 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert") pod "infra-operator-controller-manager-84b495f78-fw274" (UID: "904525e7-6f82-4fbf-928a-99194a97829a") : secret "infra-operator-webhook-server-cert" not found Dec 16 13:02:55 crc kubenswrapper[4757]: I1216 13:02:55.142943 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:02:55 crc kubenswrapper[4757]: E1216 13:02:55.143401 4757 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:55 crc kubenswrapper[4757]: E1216 13:02:55.143451 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert podName:67c1a6c7-35d1-48a9-a058-13e5d5599fe7 nodeName:}" failed. No retries permitted until 2025-12-16 13:03:03.143434776 +0000 UTC m=+968.571178572 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" (UID: "67c1a6c7-35d1-48a9-a058-13e5d5599fe7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 13:02:55 crc kubenswrapper[4757]: I1216 13:02:55.355054 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:55 crc kubenswrapper[4757]: I1216 13:02:55.355118 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:02:55 crc kubenswrapper[4757]: E1216 13:02:55.355194 4757 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 13:02:55 crc kubenswrapper[4757]: E1216 13:02:55.355215 4757 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 13:02:55 crc kubenswrapper[4757]: E1216 13:02:55.355245 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:03:03.355228675 +0000 UTC m=+968.782972471 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "webhook-server-cert" not found Dec 16 13:02:55 crc kubenswrapper[4757]: E1216 13:02:55.355258 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:03:03.355252575 +0000 UTC m=+968.782996361 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "metrics-server-cert" not found Dec 16 13:03:02 crc kubenswrapper[4757]: I1216 13:03:02.277937 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:03:02 crc kubenswrapper[4757]: I1216 13:03:02.289588 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/904525e7-6f82-4fbf-928a-99194a97829a-cert\") pod \"infra-operator-controller-manager-84b495f78-fw274\" (UID: \"904525e7-6f82-4fbf-928a-99194a97829a\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:03:02 crc kubenswrapper[4757]: I1216 13:03:02.362114 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wgxpn" Dec 16 13:03:02 crc kubenswrapper[4757]: I1216 13:03:02.371167 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" Dec 16 13:03:03 crc kubenswrapper[4757]: I1216 13:03:03.189994 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:03:03 crc kubenswrapper[4757]: I1216 13:03:03.194226 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67c1a6c7-35d1-48a9-a058-13e5d5599fe7-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9\" (UID: \"67c1a6c7-35d1-48a9-a058-13e5d5599fe7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:03:03 crc kubenswrapper[4757]: I1216 13:03:03.347728 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8dr49"
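
By 13:03:02-13:03:03 the infra-operator and openstack-baremetal-operator cert Secrets exist and their MountVolume.SetUp calls succeed, confirming that the earlier failures were an ordering race. The openstack-operator webhook-server-cert is still missing, so its retry interval, which has been doubling since the 500ms seen at 13:02:47, reaches 16s in the entries just below. The observed durationBeforeRetry progression, reproduced under the assumption of plain doubling from 500ms:

    # Reproduce the durationBeforeRetry progression seen in this log:
    # 500ms, 1s, 2s, 4s, 8s, 16s, ... (assumes plain doubling from 500ms).
    def backoff_schedule(n, initial=0.5):
        delay = initial
        for _ in range(n):
            yield delay
            delay *= 2

    print([f"{d:g}s" for d in backoff_schedule(6)])
    # ['0.5s', '1s', '2s', '4s', '8s', '16s']
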
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:03:03 crc kubenswrapper[4757]: I1216 13:03:03.392738 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:03:03 crc kubenswrapper[4757]: I1216 13:03:03.392842 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:03:03 crc kubenswrapper[4757]: E1216 13:03:03.392920 4757 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 13:03:03 crc kubenswrapper[4757]: E1216 13:03:03.392973 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs podName:7b6693c4-d7ad-4edc-ba55-baa2fea5094a nodeName:}" failed. No retries permitted until 2025-12-16 13:03:19.392955721 +0000 UTC m=+984.820699507 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs") pod "openstack-operator-controller-manager-554cfb9dfb-d6w2k" (UID: "7b6693c4-d7ad-4edc-ba55-baa2fea5094a") : secret "webhook-server-cert" not found Dec 16 13:03:03 crc kubenswrapper[4757]: I1216 13:03:03.395847 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-metrics-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:03:06 crc kubenswrapper[4757]: E1216 13:03:06.844288 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 16 13:03:06 crc kubenswrapper[4757]: E1216 13:03:06.844823 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x7wk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-68c649d9d-dxgr9_openstack-operators(e9f15431-d8cd-408d-8169-e06457cabccc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:06 crc kubenswrapper[4757]: E1216 13:03:06.846281 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" Dec 16 13:03:07 crc kubenswrapper[4757]: E1216 13:03:07.076734 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" Dec 16 13:03:09 crc kubenswrapper[4757]: E1216 13:03:09.458858 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 16 13:03:09 crc kubenswrapper[4757]: E1216 13:03:09.459403 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ngl8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66f8b87655-t9vm8_openstack-operators(a6449c1f-3695-445d-90b0-64b4c79cde05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:09 crc kubenswrapper[4757]: E1216 13:03:09.461268 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" podUID="a6449c1f-3695-445d-90b0-64b4c79cde05" Dec 16 13:03:10 crc kubenswrapper[4757]: E1216 13:03:10.090744 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" podUID="a6449c1f-3695-445d-90b0-64b4c79cde05" Dec 16 13:03:11 crc kubenswrapper[4757]: E1216 13:03:11.283065 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 16 13:03:11 crc kubenswrapper[4757]: E1216 13:03:11.283310 4757 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m6qrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6ccf486b9-w9gps_openstack-operators(6c815add-abbd-4655-b257-d50ab074414a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:11 crc kubenswrapper[4757]: E1216 13:03:11.284686 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" podUID="6c815add-abbd-4655-b257-d50ab074414a" Dec 16 13:03:12 crc kubenswrapper[4757]: E1216 13:03:12.142314 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" podUID="6c815add-abbd-4655-b257-d50ab074414a" Dec 16 13:03:13 crc kubenswrapper[4757]: E1216 13:03:13.527421 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 16 13:03:13 crc kubenswrapper[4757]: E1216 13:03:13.527619 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b9c64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-59b8dcb766-lvxk6_openstack-operators(b46e5138-a221-489d-9d7a-a54cf3938d64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:13 crc kubenswrapper[4757]: E1216 13:03:13.529040 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6" podUID="b46e5138-a221-489d-9d7a-a54cf3938d64" Dec 16 13:03:14 crc kubenswrapper[4757]: E1216 13:03:14.155857 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\"" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6" 
podUID="b46e5138-a221-489d-9d7a-a54cf3938d64" Dec 16 13:03:14 crc kubenswrapper[4757]: E1216 13:03:14.351392 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 16 13:03:14 crc kubenswrapper[4757]: E1216 13:03:14.351661 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rl4qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-x72bf_openstack-operators(60821702-232d-4eb4-b70f-15e87e070aed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:14 crc kubenswrapper[4757]: E1216 13:03:14.352758 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" podUID="60821702-232d-4eb4-b70f-15e87e070aed" Dec 16 13:03:15 crc kubenswrapper[4757]: E1216 13:03:15.161048 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" podUID="60821702-232d-4eb4-b70f-15e87e070aed" Dec 16 13:03:15 crc kubenswrapper[4757]: E1216 13:03:15.977432 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 16 13:03:15 crc kubenswrapper[4757]: E1216 13:03:15.979323 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26hws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-f458558d7-45bgz_openstack-operators(6333c537-0505-48c0-b197-a609084a2a2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:15 crc kubenswrapper[4757]: E1216 13:03:15.980602 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" Dec 16 
13:03:16 crc kubenswrapper[4757]: E1216 13:03:16.167457 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" Dec 16 13:03:19 crc kubenswrapper[4757]: I1216 13:03:19.426119 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:03:19 crc kubenswrapper[4757]: I1216 13:03:19.435293 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b6693c4-d7ad-4edc-ba55-baa2fea5094a-webhook-certs\") pod \"openstack-operator-controller-manager-554cfb9dfb-d6w2k\" (UID: \"7b6693c4-d7ad-4edc-ba55-baa2fea5094a\") " pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:03:19 crc kubenswrapper[4757]: I1216 13:03:19.711200 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hwbjr" Dec 16 13:03:19 crc kubenswrapper[4757]: I1216 13:03:19.720374 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" Dec 16 13:03:20 crc kubenswrapper[4757]: E1216 13:03:20.774814 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 16 13:03:20 crc kubenswrapper[4757]: E1216 13:03:20.775039 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86cpb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-95949466-sgvcj_openstack-operators(34c17eba-d6e6-4399-a0a0-f25ef7a89fb9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:20 crc kubenswrapper[4757]: E1216 13:03:20.776204 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj" podUID="34c17eba-d6e6-4399-a0a0-f25ef7a89fb9" Dec 16 13:03:21 crc kubenswrapper[4757]: E1216 13:03:21.213989 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj" podUID="34c17eba-d6e6-4399-a0a0-f25ef7a89fb9" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.107713 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.107923 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rwdw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8665b56d78-bk25g_openstack-operators(28ec7b61-2e0c-4ad7-8569-eeb5973b976d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.109843 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" podUID="28ec7b61-2e0c-4ad7-8569-eeb5973b976d" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.220416 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" podUID="28ec7b61-2e0c-4ad7-8569-eeb5973b976d" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.621772 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.621997 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jbtlm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5f98b4754f-4z9jz_openstack-operators(3717fd56-4339-4ad6-940d-b5023c76d32f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:22 crc kubenswrapper[4757]: E1216 13:03:22.623814 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" podUID="3717fd56-4339-4ad6-940d-b5023c76d32f" Dec 16 13:03:23 crc kubenswrapper[4757]: E1216 13:03:23.227194 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" podUID="3717fd56-4339-4ad6-940d-b5023c76d32f" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.143937 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.144479 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pdv2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5fdd9786f7-wg8wq_openstack-operators(eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.145943 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" podUID="eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.246340 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" podUID="eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.713824 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.714069 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tf2jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-97d456b9-85zkh_openstack-operators(ac2d53dd-c297-44b1-bcb1-a3025530eb5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:25 crc kubenswrapper[4757]: E1216 13:03:25.715250 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" podUID="ac2d53dd-c297-44b1-bcb1-a3025530eb5c" Dec 16 13:03:26 crc kubenswrapper[4757]: E1216 13:03:26.244677 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" 
podUID="ac2d53dd-c297-44b1-bcb1-a3025530eb5c" Dec 16 13:03:27 crc kubenswrapper[4757]: E1216 13:03:27.613743 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 16 13:03:27 crc kubenswrapper[4757]: E1216 13:03:27.614549 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4bxtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-f76f4954c-psxvw_openstack-operators(75d829d5-a3cd-48c6-8aff-07f7d325b4f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:27 crc kubenswrapper[4757]: E1216 13:03:27.616295 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" podUID="75d829d5-a3cd-48c6-8aff-07f7d325b4f9" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.180372 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.180627 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gnspl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-w6kx8_openstack-operators(42d952f0-a650-484d-9e6b-b1c6c0f252dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.181885 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.335754 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" 
podUID="75d829d5-a3cd-48c6-8aff-07f7d325b4f9" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.808268 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.808813 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdqcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-v7jvg_openstack-operators(545806dc-d916-4704-bc27-f5a46915fb56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:28 crc kubenswrapper[4757]: E1216 13:03:28.810423 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" podUID="545806dc-d916-4704-bc27-f5a46915fb56" Dec 16 13:03:29 crc kubenswrapper[4757]: E1216 13:03:29.321669 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 16 13:03:29 crc kubenswrapper[4757]: E1216 13:03:29.321904 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slwp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-wgxxx_openstack-operators(120aab20-c2fb-441d-9c07-bd05c0678a11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:03:29 crc kubenswrapper[4757]: E1216 13:03:29.323085 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" podUID="120aab20-c2fb-441d-9c07-bd05c0678a11" Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.167956 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.168409 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qs9rw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5c7cbf548f-wgrph_openstack-operators(72a6aea3-2309-4c98-802b-416feed1ba0f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.169572 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" podUID="72a6aea3-2309-4c98-802b-416feed1ba0f"
Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.296725 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" podUID="72a6aea3-2309-4c98-802b-416feed1ba0f"
Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.839902 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670"
Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.840387 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2skq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-bqfgm_openstack-operators(83154b06-c2df-4a44-9a33-4971cd60add3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 16 13:03:34 crc kubenswrapper[4757]: E1216 13:03:34.841490 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" podUID="83154b06-c2df-4a44-9a33-4971cd60add3"
Dec 16 13:03:34 crc kubenswrapper[4757]: I1216 13:03:34.961689 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 13:03:35 crc kubenswrapper[4757]: E1216 13:03:35.301504 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" podUID="83154b06-c2df-4a44-9a33-4971cd60add3"
Dec 16 13:03:35 crc kubenswrapper[4757]: E1216 13:03:35.479539 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2"
Dec 16 13:03:35 crc kubenswrapper[4757]: E1216 13:03:35.479686 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-88f8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gcxgg_openstack-operators(7432087f-983f-4b3d-af98-40238ceba951): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 16 13:03:35 crc kubenswrapper[4757]: E1216 13:03:35.483367 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" podUID="7432087f-983f-4b3d-af98-40238ceba951"
Dec 16 13:03:35 crc kubenswrapper[4757]: I1216 13:03:35.787723 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9"]
Dec 16 13:03:35 crc kubenswrapper[4757]: W1216 13:03:35.814496 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c1a6c7_35d1_48a9_a058_13e5d5599fe7.slice/crio-5b7d819f8feb0c9c85063bcf2a8db87ce3eedc146736c9d03d113d2c661dcdf4 WatchSource:0}: Error finding container 5b7d819f8feb0c9c85063bcf2a8db87ce3eedc146736c9d03d113d2c661dcdf4: Status 404 returned error can't find the container with id 5b7d819f8feb0c9c85063bcf2a8db87ce3eedc146736c9d03d113d2c661dcdf4
Dec 16 13:03:35 crc kubenswrapper[4757]: I1216 13:03:35.919338 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-fw274"]
Dec 16 13:03:35 crc kubenswrapper[4757]: W1216 13:03:35.932052 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod904525e7_6f82_4fbf_928a_99194a97829a.slice/crio-cc8ea2a8022af5ef623a1c77bac32e66417f95c68d14d5ac8e75f6aa0399532f WatchSource:0}: Error finding container cc8ea2a8022af5ef623a1c77bac32e66417f95c68d14d5ac8e75f6aa0399532f: Status 404 returned error can't find the container with id cc8ea2a8022af5ef623a1c77bac32e66417f95c68d14d5ac8e75f6aa0399532f
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.041611 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k"]
Dec 16 13:03:36 crc kubenswrapper[4757]: W1216 13:03:36.058600 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b6693c4_d7ad_4edc_ba55_baa2fea5094a.slice/crio-792bbb08c6aec376636593b31b2151b929b04fa98865b7a09caf906ad5b4540a WatchSource:0}: Error finding container 792bbb08c6aec376636593b31b2151b929b04fa98865b7a09caf906ad5b4540a: Status 404 returned error can't find the container with id 792bbb08c6aec376636593b31b2151b929b04fa98865b7a09caf906ad5b4540a
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.307884 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" event={"ID":"ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89","Type":"ContainerStarted","Data":"ab77c69461aefeb3ee4b0e11cb67faf4a056bfd852701f23f78889e34b5d05a3"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.307988 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.309861 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" event={"ID":"7b6693c4-d7ad-4edc-ba55-baa2fea5094a","Type":"ContainerStarted","Data":"1b0146996427b5480391ccfcaf16467660ad01fc565f392f1d792317bc60cb9c"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.309907 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" event={"ID":"7b6693c4-d7ad-4edc-ba55-baa2fea5094a","Type":"ContainerStarted","Data":"792bbb08c6aec376636593b31b2151b929b04fa98865b7a09caf906ad5b4540a"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.310044 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.311427 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" event={"ID":"e9f15431-d8cd-408d-8169-e06457cabccc","Type":"ContainerStarted","Data":"873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.311663 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.313791 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6" event={"ID":"b46e5138-a221-489d-9d7a-a54cf3938d64","Type":"ContainerStarted","Data":"dc6cb27d10b36cdfc8f28dfa5a281abedf781f6cfa686ba7f2e36bd9b2a9b4a2"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.313966 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.315746 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" event={"ID":"6c815add-abbd-4655-b257-d50ab074414a","Type":"ContainerStarted","Data":"5078131b0d991ad8da4b1d4040f82848187bcb373ed47691899f14d2c2d245c4"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.315939 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.317127 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" event={"ID":"67c1a6c7-35d1-48a9-a058-13e5d5599fe7","Type":"ContainerStarted","Data":"5b7d819f8feb0c9c85063bcf2a8db87ce3eedc146736c9d03d113d2c661dcdf4"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.318922 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" event={"ID":"a6449c1f-3695-445d-90b0-64b4c79cde05","Type":"ContainerStarted","Data":"0e96a81a682a19775b899ea79d1adc3ab0d7cb597c6276c988e4f7bccffc08de"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.319225 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.320210 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" event={"ID":"904525e7-6f82-4fbf-928a-99194a97829a","Type":"ContainerStarted","Data":"cc8ea2a8022af5ef623a1c77bac32e66417f95c68d14d5ac8e75f6aa0399532f"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.322141 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" event={"ID":"6333c537-0505-48c0-b197-a609084a2a2c","Type":"ContainerStarted","Data":"4ade807f82acd56ef4e7773356e3af9982ed2216a9f53ba2729b438b82747b0c"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.322321 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.324057 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq" event={"ID":"fbd5f746-9483-455c-988e-2e882623d09e","Type":"ContainerStarted","Data":"92ba53963ba5bf36ad18ca1f7d08e9237531e7b884dc8fad260b5c9534165c23"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.324195 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.326055 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" event={"ID":"60821702-232d-4eb4-b70f-15e87e070aed","Type":"ContainerStarted","Data":"4ab7e2892ef3c2132e2ec2e92756b64e506f03ed530bcd75530853c386a4f8e8"}
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.326258 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.503423 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps" podStartSLOduration=5.024583649 podStartE2EDuration="50.503404997s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.958290724 +0000 UTC m=+955.386034520" lastFinishedPulling="2025-12-16 13:03:35.437112072 +0000 UTC m=+1000.864855868" observedRunningTime="2025-12-16 13:03:36.50025554 +0000 UTC m=+1001.927999336" watchObservedRunningTime="2025-12-16 13:03:36.503404997 +0000 UTC m=+1001.931148793"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.503753 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv" podStartSLOduration=7.817379928 podStartE2EDuration="50.503746545s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.099673465 +0000 UTC m=+955.527417261" lastFinishedPulling="2025-12-16 13:03:32.786040082 +0000 UTC m=+998.213783878" observedRunningTime="2025-12-16 13:03:36.398558139 +0000 UTC m=+1001.826301965" watchObservedRunningTime="2025-12-16 13:03:36.503746545 +0000 UTC m=+1001.931490341"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.552535 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podStartSLOduration=5.155992653 podStartE2EDuration="50.552492179s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.049691351 +0000 UTC m=+955.477435147" lastFinishedPulling="2025-12-16 13:03:35.446190877 +0000 UTC m=+1000.873934673" observedRunningTime="2025-12-16 13:03:36.551630577 +0000 UTC m=+1001.979374373" watchObservedRunningTime="2025-12-16 13:03:36.552492179 +0000 UTC m=+1001.980235985"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.660605 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" podStartSLOduration=5.626761236 podStartE2EDuration="51.660585537s" podCreationTimestamp="2025-12-16 13:02:45 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.418590339 +0000 UTC m=+954.846334135" lastFinishedPulling="2025-12-16 13:03:35.45241464 +0000 UTC m=+1000.880158436" observedRunningTime="2025-12-16 13:03:36.63275896 +0000 UTC m=+1002.060502766" watchObservedRunningTime="2025-12-16 13:03:36.660585537 +0000 UTC m=+1002.088329333"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.660980 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podStartSLOduration=4.527035574 podStartE2EDuration="50.660975257s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.311130786 +0000 UTC m=+954.738874582" lastFinishedPulling="2025-12-16 13:03:35.445070469 +0000 UTC m=+1000.872814265" observedRunningTime="2025-12-16 13:03:36.659736206 +0000 UTC m=+1002.087480002" watchObservedRunningTime="2025-12-16 13:03:36.660975257 +0000 UTC m=+1002.088719053"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.741625 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k" podStartSLOduration=50.741601748 podStartE2EDuration="50.741601748s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:03:36.73643569 +0000 UTC m=+1002.164179486" watchObservedRunningTime="2025-12-16 13:03:36.741601748 +0000 UTC m=+1002.169345544"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.774394 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6" podStartSLOduration=5.806168246 podStartE2EDuration="51.774375106s" podCreationTimestamp="2025-12-16 13:02:45 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.510461368 +0000 UTC m=+954.938205164" lastFinishedPulling="2025-12-16 13:03:35.478668228 +0000 UTC m=+1000.906412024" observedRunningTime="2025-12-16 13:03:36.773831713 +0000 UTC m=+1002.201575509" watchObservedRunningTime="2025-12-16 13:03:36.774375106 +0000 UTC m=+1002.202118892"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.802085 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq" podStartSLOduration=11.190578337 podStartE2EDuration="51.80206766s" podCreationTimestamp="2025-12-16 13:02:45 +0000 UTC" firstStartedPulling="2025-12-16 13:02:48.192049478 +0000 UTC m=+953.619793274" lastFinishedPulling="2025-12-16 13:03:28.803538801 +0000 UTC m=+994.231282597" observedRunningTime="2025-12-16 13:03:36.795843027 +0000 UTC m=+1002.223586823" watchObservedRunningTime="2025-12-16 13:03:36.80206766 +0000 UTC m=+1002.229811456"
Dec 16 13:03:36 crc kubenswrapper[4757]: I1216 13:03:36.823132 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf" podStartSLOduration=5.348256869 podStartE2EDuration="50.823025448s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.971548081 +0000 UTC m=+955.399291877" lastFinishedPulling="2025-12-16 13:03:35.44631666 +0000 UTC m=+1000.874060456" observedRunningTime="2025-12-16 13:03:36.819032499 +0000 UTC m=+1002.246776295" watchObservedRunningTime="2025-12-16 13:03:36.823025448 +0000 UTC m=+1002.250769244"
Dec 16 13:03:39 crc kubenswrapper[4757]: I1216 13:03:39.346667 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj" event={"ID":"34c17eba-d6e6-4399-a0a0-f25ef7a89fb9","Type":"ContainerStarted","Data":"19fd70f8f385284f028dde8aebcc51c39241e5fba5c2a108c94b207047b48735"}
Dec 16 13:03:39 crc kubenswrapper[4757]: I1216 13:03:39.347434 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"
Dec 16 13:03:39 crc kubenswrapper[4757]: I1216 13:03:39.371026 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj" podStartSLOduration=5.485673934 podStartE2EDuration="54.370981762s" podCreationTimestamp="2025-12-16 13:02:45 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.424159627 +0000 UTC m=+954.851903423" lastFinishedPulling="2025-12-16 13:03:38.309467455 +0000 UTC m=+1003.737211251" observedRunningTime="2025-12-16 13:03:39.362428871 +0000 UTC m=+1004.790172687" watchObservedRunningTime="2025-12-16 13:03:39.370981762 +0000 UTC m=+1004.798725568"
Dec 16 13:03:40 crc kubenswrapper[4757]: E1216 13:03:40.956301 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" podUID="545806dc-d916-4704-bc27-f5a46915fb56"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.382437 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" event={"ID":"67c1a6c7-35d1-48a9-a058-13e5d5599fe7","Type":"ContainerStarted","Data":"470eb1409f4aed957b7b2a9038ea8e32c408a7d6788cc603bb5830d7c3eb700a"}
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.382617 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.384322 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" event={"ID":"904525e7-6f82-4fbf-928a-99194a97829a","Type":"ContainerStarted","Data":"d1195da45031af497e20b5bf153203a9e56cacbee8f1db0eb9a3707fb8c6ab19"}
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.384470 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.385744 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" event={"ID":"3717fd56-4339-4ad6-940d-b5023c76d32f","Type":"ContainerStarted","Data":"0a886290841f66c8a939ca85d74a0da3dd45779a873263fae160e5ab39ced3e5"}
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.386093 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.388558 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" event={"ID":"ac2d53dd-c297-44b1-bcb1-a3025530eb5c","Type":"ContainerStarted","Data":"5bec7ca51a73e79d51862db1f27fecca2dc19ee3017b9ee8af5acf6fd3aaf4fd"}
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.388816 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.389918 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" event={"ID":"28ec7b61-2e0c-4ad7-8569-eeb5973b976d","Type":"ContainerStarted","Data":"795384b3315a694a2e43c1f899d7a717f74d540193112021bd71c9c8ba957558"}
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.390192 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.427218 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podStartSLOduration=50.347381254 podStartE2EDuration="55.427197515s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:03:35.82182446 +0000 UTC m=+1001.249568256" lastFinishedPulling="2025-12-16 13:03:40.901640721 +0000 UTC m=+1006.329384517" observedRunningTime="2025-12-16 13:03:41.421073855 +0000 UTC m=+1006.848817651" watchObservedRunningTime="2025-12-16 13:03:41.427197515 +0000 UTC m=+1006.854941311"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.443218 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh" podStartSLOduration=4.807976961 podStartE2EDuration="55.443200411s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.267546389 +0000 UTC m=+955.695290185" lastFinishedPulling="2025-12-16 13:03:40.902769849 +0000 UTC m=+1006.330513635" observedRunningTime="2025-12-16 13:03:41.440940375 +0000 UTC m=+1006.868684171" watchObservedRunningTime="2025-12-16 13:03:41.443200411 +0000 UTC m=+1006.870944207"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.501965 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274" podStartSLOduration=50.538345049 podStartE2EDuration="55.501942661s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:03:35.937614879 +0000 UTC m=+1001.365358675" lastFinishedPulling="2025-12-16 13:03:40.901212491 +0000 UTC m=+1006.328956287" observedRunningTime="2025-12-16 13:03:41.49868136 +0000 UTC m=+1006.926425156" watchObservedRunningTime="2025-12-16 13:03:41.501942661 +0000 UTC m=+1006.929686457"
Dec 16 13:03:41 crc kubenswrapper[4757]: I1216 13:03:41.502749 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g" podStartSLOduration=4.935782776 podStartE2EDuration="55.502743441s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.341486484 +0000 UTC m=+955.769230280" lastFinishedPulling="2025-12-16 13:03:40.908447149 +0000 UTC m=+1006.336190945" observedRunningTime="2025-12-16 13:03:41.473975891 +0000 UTC m=+1006.901719707" watchObservedRunningTime="2025-12-16 13:03:41.502743441 +0000 UTC m=+1006.930487237"
Dec 16 13:03:42 crc kubenswrapper[4757]: E1216 13:03:42.951835 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" podUID="120aab20-c2fb-441d-9c07-bd05c0678a11"
Dec 16 13:03:42 crc kubenswrapper[4757]: E1216 13:03:42.952172 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc"
Dec 16 13:03:42 crc kubenswrapper[4757]: I1216 13:03:42.973742 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" podStartSLOduration=6.506044865 podStartE2EDuration="57.973717687s" podCreationTimestamp="2025-12-16 13:02:45 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.437115046 +0000 UTC m=+954.864858852" lastFinishedPulling="2025-12-16 13:03:40.904787878 +0000 UTC m=+1006.332531674" observedRunningTime="2025-12-16 13:03:41.536863924 +0000 UTC m=+1006.964607720" watchObservedRunningTime="2025-12-16 13:03:42.973717687 +0000 UTC m=+1008.401461483"
Dec 16 13:03:43 crc kubenswrapper[4757]: I1216 13:03:43.401587 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" event={"ID":"eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0","Type":"ContainerStarted","Data":"57b54f9d55f9731af2fb70909cb01f125e0384261e2bbe29a28c13eff9f7c3f1"}
Dec 16 13:03:43 crc kubenswrapper[4757]: I1216 13:03:43.402891 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq"
Dec 16 13:03:44 crc kubenswrapper[4757]: I1216 13:03:44.409097 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" event={"ID":"75d829d5-a3cd-48c6-8aff-07f7d325b4f9","Type":"ContainerStarted","Data":"8d06b6d47a466969f024cc06fdc142b8d58e2e2f94c8632966887fb946b051cb"}
Dec 16 13:03:44 crc kubenswrapper[4757]: I1216 13:03:44.409409 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw"
Dec 16 13:03:44 crc kubenswrapper[4757]: I1216 13:03:44.434534 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" podStartSLOduration=4.457698763 podStartE2EDuration="58.434516641s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.567925176 +0000 UTC m=+954.995668972" lastFinishedPulling="2025-12-16 13:03:43.544743054 +0000 UTC m=+1008.972486850" observedRunningTime="2025-12-16 13:03:44.432125132 +0000 UTC m=+1009.859868948" watchObservedRunningTime="2025-12-16 13:03:44.434516641 +0000 UTC m=+1009.862260437"
Dec 16 13:03:44 crc kubenswrapper[4757]: I1216 13:03:44.437484 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq" podStartSLOduration=5.409205654 podStartE2EDuration="58.437472973s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.055180416 +0000 UTC m=+955.482924212" lastFinishedPulling="2025-12-16 13:03:43.083447735 +0000 UTC m=+1008.511191531" observedRunningTime="2025-12-16 13:03:43.442315686 +0000 UTC m=+1008.870059482" watchObservedRunningTime="2025-12-16 13:03:44.437472973 +0000 UTC m=+1009.865216769"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.249549 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.271617 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-qtkjq"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.280602 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-lvxk6"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.511194 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.531047 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-95949466-sgvcj"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.628229 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.886295 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-w9gps"
Dec 16 13:03:46 crc kubenswrapper[4757]: I1216 13:03:46.891429 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"
Dec 16 13:03:47 crc kubenswrapper[4757]: I1216 13:03:47.179585 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-bk25g"
Dec 16 13:03:47 crc kubenswrapper[4757]: I1216 13:03:47.294761 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-x72bf"
Dec 16 13:03:47 crc kubenswrapper[4757]: I1216 13:03:47.807819 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-85zkh"
Dec 16 13:03:47 crc kubenswrapper[4757]: I1216 13:03:47.884346 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dr2qv"
Dec 16 13:03:48 crc kubenswrapper[4757]: E1216 13:03:48.950058 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" podUID="7432087f-983f-4b3d-af98-40238ceba951"
Dec 16 13:03:49 crc kubenswrapper[4757]: I1216 13:03:49.443538 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" event={"ID":"83154b06-c2df-4a44-9a33-4971cd60add3","Type":"ContainerStarted","Data":"c942d78c400e83c1c3ff7ccfe79ad5b285295d06bce5abde2e0c155139dd5246"}
Dec 16 13:03:49 crc kubenswrapper[4757]: I1216 13:03:49.443863 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm"
Dec 16 13:03:49 crc kubenswrapper[4757]: I1216 13:03:49.461613 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm" podStartSLOduration=4.177217378 podStartE2EDuration="1m3.46159284s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.511136514 +0000 UTC m=+954.938880310" lastFinishedPulling="2025-12-16 13:03:48.795511956 +0000 UTC m=+1014.223255772" observedRunningTime="2025-12-16 13:03:49.458957595 +0000 UTC m=+1014.886701391" watchObservedRunningTime="2025-12-16 13:03:49.46159284 +0000 UTC m=+1014.889336636"
Dec 16 13:03:49 crc kubenswrapper[4757]: I1216 13:03:49.728084 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-554cfb9dfb-d6w2k"
Dec 16 13:03:51 crc kubenswrapper[4757]: I1216 13:03:51.458490 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" event={"ID":"72a6aea3-2309-4c98-802b-416feed1ba0f","Type":"ContainerStarted","Data":"a7dad38154efb5cc328e6fcacaee79914a499164ba97d5811417409ced2f8f92"}
Dec 16 13:03:51 crc kubenswrapper[4757]: I1216 13:03:51.458984 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph"
Dec 16 13:03:51 crc kubenswrapper[4757]: I1216 13:03:51.480138 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph" podStartSLOduration=4.94162041 podStartE2EDuration="1m5.480121563s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:49.842394293 +0000 UTC m=+955.270138089" lastFinishedPulling="2025-12-16 13:03:50.380895436 +0000 UTC m=+1015.808639242" observedRunningTime="2025-12-16 13:03:51.475029558 +0000 UTC m=+1016.902773344" watchObservedRunningTime="2025-12-16 13:03:51.480121563 +0000 UTC m=+1016.907865359"
Dec 16 13:03:52 crc kubenswrapper[4757]: I1216 13:03:52.376877 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84b495f78-fw274"
Dec 16 13:03:53 crc kubenswrapper[4757]: I1216 13:03:53.368187 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9"
Dec 16 13:03:54 crc kubenswrapper[4757]: I1216 13:03:54.480257 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" event={"ID":"545806dc-d916-4704-bc27-f5a46915fb56","Type":"ContainerStarted","Data":"9b044e2e970822ad2de4ddc6a4fea44b90230eefd896790b5108c0e47a5c88c3"}
Dec 16 13:03:54 crc kubenswrapper[4757]: I1216 13:03:54.480799 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg"
Dec 16 13:03:55 crc kubenswrapper[4757]: I1216 13:03:55.488238 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" event={"ID":"42d952f0-a650-484d-9e6b-b1c6c0f252dc","Type":"ContainerStarted","Data":"35a1bd5dd410692a492294e2f3487d2cfd04f9b69a087852025e7589c08d90b9"}
Dec 16 13:03:55 crc kubenswrapper[4757]: I1216 13:03:55.490227 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8"
Dec 16 13:03:55 crc kubenswrapper[4757]: I1216 13:03:55.504105 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg" podStartSLOduration=6.146368973 podStartE2EDuration="1m9.504089697s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.267767815 +0000 UTC m=+955.695511611" lastFinishedPulling="2025-12-16 13:03:53.625488539 +0000 UTC m=+1019.053232335" observedRunningTime="2025-12-16 13:03:54.514589429 +0000 UTC m=+1019.942333235" watchObservedRunningTime="2025-12-16 13:03:55.504089697 +0000 UTC m=+1020.931833493"
Dec 16 13:03:55 crc kubenswrapper[4757]: I1216 13:03:55.505822 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podStartSLOduration=5.030185718 podStartE2EDuration="1m9.505814001s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.307321461 +0000 UTC m=+955.735065257" lastFinishedPulling="2025-12-16 13:03:54.782949734 +0000 UTC m=+1020.210693540" observedRunningTime="2025-12-16 13:03:55.501444472 +0000 UTC m=+1020.929188268" watchObservedRunningTime="2025-12-16 13:03:55.505814001 +0000 UTC m=+1020.933557797"
Dec 16 13:03:56 crc kubenswrapper[4757]: I1216 13:03:56.911605 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw"
Dec 16 13:03:56 crc kubenswrapper[4757]: I1216 13:03:56.938085 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-bqfgm"
Dec 16 13:03:57 crc kubenswrapper[4757]: I1216 13:03:57.031484 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-wg8wq"
Dec 16 13:03:57 crc kubenswrapper[4757]: I1216 13:03:57.032782 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-wgrph"
Dec 16 13:03:59 crc kubenswrapper[4757]: I1216 13:03:59.514498 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" event={"ID":"120aab20-c2fb-441d-9c07-bd05c0678a11","Type":"ContainerStarted","Data":"79c6466db646f814ceb98ef33fcc4f42062ee846f07ad09524a63bdf017181b6"}
Dec 16 13:03:59 crc kubenswrapper[4757]: I1216 13:03:59.515064 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx"
Dec 16 13:03:59 crc kubenswrapper[4757]: I1216 13:03:59.531146 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx" podStartSLOduration=4.9415995299999995 podStartE2EDuration="1m13.531126227s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.104317899 +0000 UTC m=+955.532061695" lastFinishedPulling="2025-12-16 13:03:58.693844596 +0000 UTC m=+1024.121588392" observedRunningTime="2025-12-16 13:03:59.528196385 +0000 UTC m=+1024.955940201" watchObservedRunningTime="2025-12-16 13:03:59.531126227 +0000 UTC m=+1024.958870023"
Dec 16 13:04:01 crc kubenswrapper[4757]: I1216 13:04:01.529459 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" event={"ID":"7432087f-983f-4b3d-af98-40238ceba951","Type":"ContainerStarted","Data":"21c3da7bad675a6c97ec721eded3129be22cd00d9b49c847e68b81ef10a40e4e"}
Dec 16 13:04:01 crc kubenswrapper[4757]: I1216 13:04:01.548280 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcxgg" podStartSLOduration=5.057496721 podStartE2EDuration="1m15.548253996s" podCreationTimestamp="2025-12-16 13:02:46 +0000 UTC" firstStartedPulling="2025-12-16 13:02:50.100943716 +0000 UTC m=+955.528687512" lastFinishedPulling="2025-12-16 13:04:00.591700981 +0000 UTC m=+1026.019444787" observedRunningTime="2025-12-16 13:04:01.547281452 +0000 UTC m=+1026.975025258" watchObservedRunningTime="2025-12-16 13:04:01.548253996 +0000 UTC m=+1026.975997792"
Dec 16 13:04:07 crc kubenswrapper[4757]: I1216 13:04:07.084313 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-v7jvg"
Dec 16 13:04:07 crc kubenswrapper[4757]: I1216 13:04:07.355142 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-wgxxx"
Dec 16 13:04:07 crc kubenswrapper[4757]: I1216 13:04:07.867549 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8"
Dec 16 13:04:21 crc kubenswrapper[4757]: I1216 13:04:21.181751 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 13:04:21 crc kubenswrapper[4757]: I1216 13:04:21.183368 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.729466 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dfc6j"]
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.731046 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.742243 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.742593 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.742906 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dfc6j"]
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.744259 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.747518 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-84djb"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.760363 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-config\") pod \"dnsmasq-dns-675f4bcbfc-dfc6j\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.760448 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqcn\" (UniqueName: \"kubernetes.io/projected/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-kube-api-access-8vqcn\") pod \"dnsmasq-dns-675f4bcbfc-dfc6j\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.848680 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxfw6"]
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.849683 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.854490 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.861357 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-config\") pod \"dnsmasq-dns-675f4bcbfc-dfc6j\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.861463 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqcn\" (UniqueName: \"kubernetes.io/projected/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-kube-api-access-8vqcn\") pod \"dnsmasq-dns-675f4bcbfc-dfc6j\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.862467 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-config\") pod \"dnsmasq-dns-675f4bcbfc-dfc6j\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.877814 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxfw6"]
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.910244 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqcn\" (UniqueName: \"kubernetes.io/projected/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-kube-api-access-8vqcn\") pod \"dnsmasq-dns-675f4bcbfc-dfc6j\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.963221 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqv2d\" (UniqueName: \"kubernetes.io/projected/0566503d-6ae6-445d-b934-1910ca733474-kube-api-access-mqv2d\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.963291 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-config\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:22 crc kubenswrapper[4757]: I1216 13:04:22.963332 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.046797 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.065171 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.065287 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqv2d\" (UniqueName: \"kubernetes.io/projected/0566503d-6ae6-445d-b934-1910ca733474-kube-api-access-mqv2d\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.065308 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-config\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.066313 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-config\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.066868 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.101048 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqv2d\" (UniqueName: \"kubernetes.io/projected/0566503d-6ae6-445d-b934-1910ca733474-kube-api-access-mqv2d\") pod \"dnsmasq-dns-78dd6ddcc-bxfw6\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.163350 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6"
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.511112 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dfc6j"]
Dec 16 13:04:23 crc kubenswrapper[4757]: W1216 13:04:23.515619 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd3cbd0_b882_434b_b6ae_62e6bc9831be.slice/crio-8127aff482668694557b562dcc189024186e4763dfa0a0aefccd8442e46633b2 WatchSource:0}: Error finding container 8127aff482668694557b562dcc189024186e4763dfa0a0aefccd8442e46633b2: Status 404 returned error can't find the container with id 8127aff482668694557b562dcc189024186e4763dfa0a0aefccd8442e46633b2
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.652305 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxfw6"]
Dec 16 13:04:23 crc kubenswrapper[4757]: W1216 13:04:23.657127 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0566503d_6ae6_445d_b934_1910ca733474.slice/crio-51757cb1c7a6375fe9ac898eae7b1cccdc7d8efe3f65eb365fb7a23bf84962cb WatchSource:0}: Error finding container 51757cb1c7a6375fe9ac898eae7b1cccdc7d8efe3f65eb365fb7a23bf84962cb: Status 404 returned error can't find the container with id 51757cb1c7a6375fe9ac898eae7b1cccdc7d8efe3f65eb365fb7a23bf84962cb
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.693671 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6" event={"ID":"0566503d-6ae6-445d-b934-1910ca733474","Type":"ContainerStarted","Data":"51757cb1c7a6375fe9ac898eae7b1cccdc7d8efe3f65eb365fb7a23bf84962cb"}
Dec 16 13:04:23 crc kubenswrapper[4757]: I1216 13:04:23.695046 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j" event={"ID":"6cd3cbd0-b882-434b-b6ae-62e6bc9831be","Type":"ContainerStarted","Data":"8127aff482668694557b562dcc189024186e4763dfa0a0aefccd8442e46633b2"}
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.647910 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dfc6j"]
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.685144 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gsqln"]
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.687646 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.709339 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gsqln"]
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.718896 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-config\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.718982 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.719067 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslfx\" (UniqueName: \"kubernetes.io/projected/ea719a31-714a-4959-8a0d-77b7a1ae769f-kube-api-access-wslfx\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.827711 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wslfx\" (UniqueName: \"kubernetes.io/projected/ea719a31-714a-4959-8a0d-77b7a1ae769f-kube-api-access-wslfx\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.827781 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-config\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.827848 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.829030 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.829974 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-config\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:25 crc kubenswrapper[4757]: I1216 13:04:25.863077 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wslfx\" (UniqueName: \"kubernetes.io/projected/ea719a31-714a-4959-8a0d-77b7a1ae769f-kube-api-access-wslfx\") pod \"dnsmasq-dns-666b6646f7-gsqln\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.028187 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gsqln"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.204322 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxfw6"]
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.264616 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2b69"]
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.266044 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.293954 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2b69"]
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.334798 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbgn\" (UniqueName: \"kubernetes.io/projected/6aa77e81-cd27-4624-8604-684ae64ff3fb-kube-api-access-7jbgn\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.334840 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.334871 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-config\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.438920 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbgn\" (UniqueName: \"kubernetes.io/projected/6aa77e81-cd27-4624-8604-684ae64ff3fb-kube-api-access-7jbgn\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.439404 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.440319 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.440442 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-config\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.442412 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-config\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.469820 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbgn\" (UniqueName: \"kubernetes.io/projected/6aa77e81-cd27-4624-8604-684ae64ff3fb-kube-api-access-7jbgn\") pod \"dnsmasq-dns-57d769cc4f-t2b69\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.588205 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.797257 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gsqln"]
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.898208 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.899647 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.904586 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lkljn"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.911485 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.911705 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.911924 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.923932 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.924126 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.924230 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 16 13:04:26 crc kubenswrapper[4757]: I1216 13:04:26.947087 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.054890 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0dd86b6-b617-44fa-aabc-f073e1df12ca-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0"
Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.054993 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055041 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055061 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055094 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055166 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-config-data\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055237 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0dd86b6-b617-44fa-aabc-f073e1df12ca-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055255 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055309 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055352 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.055428 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhrm\" (UniqueName: 
\"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-kube-api-access-9nhrm\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156432 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhrm\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-kube-api-access-9nhrm\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156490 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0dd86b6-b617-44fa-aabc-f073e1df12ca-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156532 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156566 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156596 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156621 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156645 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-config-data\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156693 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0dd86b6-b617-44fa-aabc-f073e1df12ca-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156712 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156741 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.156765 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.157075 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.157316 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.158060 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.158821 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.159160 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-config-data\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.159879 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.161182 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0dd86b6-b617-44fa-aabc-f073e1df12ca-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.162843 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0dd86b6-b617-44fa-aabc-f073e1df12ca-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " 
pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.163128 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.176315 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhrm\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-kube-api-access-9nhrm\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.179663 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.179682 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.264794 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2b69"] Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.277482 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: W1216 13:04:27.347648 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aa77e81_cd27_4624_8604_684ae64ff3fb.slice/crio-1fcd4511cc86d85869f89cef54bdc9dd0f559063bf47157ba4235092b934b64d WatchSource:0}: Error finding container 1fcd4511cc86d85869f89cef54bdc9dd0f559063bf47157ba4235092b934b64d: Status 404 returned error can't find the container with id 1fcd4511cc86d85869f89cef54bdc9dd0f559063bf47157ba4235092b934b64d Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.464756 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.467858 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.469897 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.471244 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.471535 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.471682 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fmqf2" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.471826 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.471968 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.472150 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.472719 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571216 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571283 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38824624-9325-4515-ab97-157001f60385-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571344 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571376 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571410 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38824624-9325-4515-ab97-157001f60385-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571437 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571465 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571562 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhkdv\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-kube-api-access-vhkdv\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571619 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571648 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.571697 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.674972 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675339 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675370 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38824624-9325-4515-ab97-157001f60385-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675411 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675438 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675471 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38824624-9325-4515-ab97-157001f60385-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675497 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675556 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675599 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkdv\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-kube-api-access-vhkdv\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675626 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.675672 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.678757 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.679085 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.679167 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.680633 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.681233 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.682525 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.691962 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38824624-9325-4515-ab97-157001f60385-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.695939 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38824624-9325-4515-ab97-157001f60385-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.697263 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.709041 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhkdv\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-kube-api-access-vhkdv\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.722947 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.733712 4757 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.758562 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" event={"ID":"6aa77e81-cd27-4624-8604-684ae64ff3fb","Type":"ContainerStarted","Data":"1fcd4511cc86d85869f89cef54bdc9dd0f559063bf47157ba4235092b934b64d"} Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.761237 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" event={"ID":"ea719a31-714a-4959-8a0d-77b7a1ae769f","Type":"ContainerStarted","Data":"b7b5c6469f14948f0b892091d5b49e4d4fa72ccdf614015fbca20f726ccd348a"} Dec 16 13:04:27 crc kubenswrapper[4757]: I1216 13:04:27.821867 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.004352 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:04:28 crc kubenswrapper[4757]: W1216 13:04:28.014371 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dd86b6_b617_44fa_aabc_f073e1df12ca.slice/crio-53c816a9541180ed42b64598c79bff691f6773e5e6db0d1e0bac94f72d14763f WatchSource:0}: Error finding container 53c816a9541180ed42b64598c79bff691f6773e5e6db0d1e0bac94f72d14763f: Status 404 returned error can't find the container with id 53c816a9541180ed42b64598c79bff691f6773e5e6db0d1e0bac94f72d14763f Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.495315 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:04:28 crc kubenswrapper[4757]: W1216 13:04:28.499894 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38824624_9325_4515_ab97_157001f60385.slice/crio-88f607441a18fb63650f60bfb09f38014869e4c2c0c6187fc6d8dffa4de4aa91 WatchSource:0}: Error finding container 88f607441a18fb63650f60bfb09f38014869e4c2c0c6187fc6d8dffa4de4aa91: Status 404 returned error can't find the container with id 88f607441a18fb63650f60bfb09f38014869e4c2c0c6187fc6d8dffa4de4aa91 Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.670295 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.673852 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.680600 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.683916 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wsp99" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.684214 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.706352 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.706710 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.713265 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.802734 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38824624-9325-4515-ab97-157001f60385","Type":"ContainerStarted","Data":"88f607441a18fb63650f60bfb09f38014869e4c2c0c6187fc6d8dffa4de4aa91"} Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.814224 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d0dd86b6-b617-44fa-aabc-f073e1df12ca","Type":"ContainerStarted","Data":"53c816a9541180ed42b64598c79bff691f6773e5e6db0d1e0bac94f72d14763f"} Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.817831 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d16196-ea98-44e5-b859-bea9a8392c01-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.817885 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c2d16196-ea98-44e5-b859-bea9a8392c01-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.817928 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.817965 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-config-data-default\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.817997 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d16196-ea98-44e5-b859-bea9a8392c01-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.818042 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.818078 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-kolla-config\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.818111 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmd5h\" (UniqueName: \"kubernetes.io/projected/c2d16196-ea98-44e5-b859-bea9a8392c01-kube-api-access-kmd5h\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.919413 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c2d16196-ea98-44e5-b859-bea9a8392c01-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920246 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d16196-ea98-44e5-b859-bea9a8392c01-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920365 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920471 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-config-data-default\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920589 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d16196-ea98-44e5-b859-bea9a8392c01-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920722 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920831 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-kolla-config\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.920945 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmd5h\" (UniqueName: \"kubernetes.io/projected/c2d16196-ea98-44e5-b859-bea9a8392c01-kube-api-access-kmd5h\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.922042 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.923523 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c2d16196-ea98-44e5-b859-bea9a8392c01-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.927402 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-config-data-default\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.927498 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.941963 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2d16196-ea98-44e5-b859-bea9a8392c01-kolla-config\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.942807 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d16196-ea98-44e5-b859-bea9a8392c01-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.954295 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2d16196-ea98-44e5-b859-bea9a8392c01-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.959701 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmd5h\" (UniqueName: \"kubernetes.io/projected/c2d16196-ea98-44e5-b859-bea9a8392c01-kube-api-access-kmd5h\") pod \"openstack-galera-0\" 
(UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:28 crc kubenswrapper[4757]: I1216 13:04:28.983998 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"c2d16196-ea98-44e5-b859-bea9a8392c01\") " pod="openstack/openstack-galera-0" Dec 16 13:04:29 crc kubenswrapper[4757]: I1216 13:04:29.040194 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 13:04:29 crc kubenswrapper[4757]: I1216 13:04:29.874239 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 13:04:29 crc kubenswrapper[4757]: W1216 13:04:29.892253 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2d16196_ea98_44e5_b859_bea9a8392c01.slice/crio-a60bc011b3944f3aec9469971ccbdaeebbd84ab4b354d78679a955b8e6439ddf WatchSource:0}: Error finding container a60bc011b3944f3aec9469971ccbdaeebbd84ab4b354d78679a955b8e6439ddf: Status 404 returned error can't find the container with id a60bc011b3944f3aec9469971ccbdaeebbd84ab4b354d78679a955b8e6439ddf Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.272283 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.273158 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.280764 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-lhw6q" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.281151 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.281309 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.285600 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.312127 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.313635 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.321030 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.321127 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.321365 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.331501 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dsgj2" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.347910 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.385138 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3e06047-7a9b-46ce-9021-a88b62993e3d-kolla-config\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.385188 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e06047-7a9b-46ce-9021-a88b62993e3d-config-data\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.385208 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ftp\" (UniqueName: \"kubernetes.io/projected/f3e06047-7a9b-46ce-9021-a88b62993e3d-kube-api-access-s8ftp\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.385239 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e06047-7a9b-46ce-9021-a88b62993e3d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.385269 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e06047-7a9b-46ce-9021-a88b62993e3d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487759 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e06047-7a9b-46ce-9021-a88b62993e3d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487813 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " 
pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487849 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8ggg\" (UniqueName: \"kubernetes.io/projected/baba14f2-35db-422f-a583-724854b001d1-kube-api-access-d8ggg\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487889 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487910 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baba14f2-35db-422f-a583-724854b001d1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487929 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/baba14f2-35db-422f-a583-724854b001d1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487949 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487965 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3e06047-7a9b-46ce-9021-a88b62993e3d-kolla-config\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.487989 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e06047-7a9b-46ce-9021-a88b62993e3d-config-data\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.488015 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ftp\" (UniqueName: \"kubernetes.io/projected/f3e06047-7a9b-46ce-9021-a88b62993e3d-kube-api-access-s8ftp\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.488035 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/baba14f2-35db-422f-a583-724854b001d1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc 
kubenswrapper[4757]: I1216 13:04:30.488063 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e06047-7a9b-46ce-9021-a88b62993e3d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.488088 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.493911 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e06047-7a9b-46ce-9021-a88b62993e3d-config-data\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.502158 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e06047-7a9b-46ce-9021-a88b62993e3d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.507913 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e06047-7a9b-46ce-9021-a88b62993e3d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.493996 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3e06047-7a9b-46ce-9021-a88b62993e3d-kolla-config\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.517140 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ftp\" (UniqueName: \"kubernetes.io/projected/f3e06047-7a9b-46ce-9021-a88b62993e3d-kube-api-access-s8ftp\") pod \"memcached-0\" (UID: \"f3e06047-7a9b-46ce-9021-a88b62993e3d\") " pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589058 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8ggg\" (UniqueName: \"kubernetes.io/projected/baba14f2-35db-422f-a583-724854b001d1-kube-api-access-d8ggg\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589117 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589139 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baba14f2-35db-422f-a583-724854b001d1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589163 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/baba14f2-35db-422f-a583-724854b001d1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589193 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589236 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/baba14f2-35db-422f-a583-724854b001d1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589284 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589328 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.589729 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.590384 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.590739 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.590970 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/baba14f2-35db-422f-a583-724854b001d1-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " 
pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.591318 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/baba14f2-35db-422f-a583-724854b001d1-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.600570 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/baba14f2-35db-422f-a583-724854b001d1-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.600965 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.602182 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baba14f2-35db-422f-a583-724854b001d1-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.626912 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8ggg\" (UniqueName: \"kubernetes.io/projected/baba14f2-35db-422f-a583-724854b001d1-kube-api-access-d8ggg\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.649313 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"baba14f2-35db-422f-a583-724854b001d1\") " pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.931097 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2d16196-ea98-44e5-b859-bea9a8392c01","Type":"ContainerStarted","Data":"a60bc011b3944f3aec9469971ccbdaeebbd84ab4b354d78679a955b8e6439ddf"} Dec 16 13:04:30 crc kubenswrapper[4757]: I1216 13:04:30.935419 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 13:04:31 crc kubenswrapper[4757]: I1216 13:04:31.261565 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 13:04:31 crc kubenswrapper[4757]: I1216 13:04:31.696232 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 13:04:31 crc kubenswrapper[4757]: I1216 13:04:31.939706 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f3e06047-7a9b-46ce-9021-a88b62993e3d","Type":"ContainerStarted","Data":"e6dec37bbdeeda469b230153b550a430e2a78339ea07f2fa28f5ee747f19054c"} Dec 16 13:04:31 crc kubenswrapper[4757]: I1216 13:04:31.940493 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"baba14f2-35db-422f-a583-724854b001d1","Type":"ContainerStarted","Data":"5714b9e4d7cc413d64b09d9cbecf9bdea3f5afde87cc80bb6c88f87ae48d3df6"} Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.389951 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.391224 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.394282 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-m6l87" Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.412426 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.535167 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld2mn\" (UniqueName: \"kubernetes.io/projected/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46-kube-api-access-ld2mn\") pod \"kube-state-metrics-0\" (UID: \"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46\") " pod="openstack/kube-state-metrics-0" Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.636289 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld2mn\" (UniqueName: \"kubernetes.io/projected/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46-kube-api-access-ld2mn\") pod \"kube-state-metrics-0\" (UID: \"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46\") " pod="openstack/kube-state-metrics-0" Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.669454 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld2mn\" (UniqueName: \"kubernetes.io/projected/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46-kube-api-access-ld2mn\") pod \"kube-state-metrics-0\" (UID: \"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46\") " pod="openstack/kube-state-metrics-0" Dec 16 13:04:32 crc kubenswrapper[4757]: I1216 13:04:32.718861 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 13:04:33 crc kubenswrapper[4757]: I1216 13:04:33.492596 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.605833 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xjblp"] Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.608493 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.611545 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.612092 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zkccd" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.612219 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.632112 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjblp"] Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695026 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824c8db6-764f-4062-85c5-3c0fcbe434ce-combined-ca-bundle\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695108 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/824c8db6-764f-4062-85c5-3c0fcbe434ce-ovn-controller-tls-certs\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695152 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-run\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695172 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c8db6-764f-4062-85c5-3c0fcbe434ce-scripts\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695223 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-run-ovn\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695243 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-log-ovn\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.695259 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7sf\" (UniqueName: \"kubernetes.io/projected/824c8db6-764f-4062-85c5-3c0fcbe434ce-kube-api-access-hf7sf\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.701267 4757 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2mmpx"] Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.706529 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.719146 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2mmpx"] Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.797363 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c8db6-764f-4062-85c5-3c0fcbe434ce-scripts\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.801656 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-scripts\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.801762 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-etc-ovs\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.801780 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nchxf\" (UniqueName: \"kubernetes.io/projected/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-kube-api-access-nchxf\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.801877 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-run-ovn\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.801905 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-log-ovn\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.801925 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7sf\" (UniqueName: \"kubernetes.io/projected/824c8db6-764f-4062-85c5-3c0fcbe434ce-kube-api-access-hf7sf\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.802000 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-log\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.802052 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824c8db6-764f-4062-85c5-3c0fcbe434ce-combined-ca-bundle\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.802141 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-lib\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.802665 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/824c8db6-764f-4062-85c5-3c0fcbe434ce-ovn-controller-tls-certs\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.802764 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-run\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.802802 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-run\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.804181 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-run-ovn\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.804474 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-log-ovn\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.804817 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/824c8db6-764f-4062-85c5-3c0fcbe434ce-var-run\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.804899 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c8db6-764f-4062-85c5-3c0fcbe434ce-scripts\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.809325 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/824c8db6-764f-4062-85c5-3c0fcbe434ce-ovn-controller-tls-certs\") pod \"ovn-controller-xjblp\" (UID: 
\"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.821134 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824c8db6-764f-4062-85c5-3c0fcbe434ce-combined-ca-bundle\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.829569 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7sf\" (UniqueName: \"kubernetes.io/projected/824c8db6-764f-4062-85c5-3c0fcbe434ce-kube-api-access-hf7sf\") pod \"ovn-controller-xjblp\" (UID: \"824c8db6-764f-4062-85c5-3c0fcbe434ce\") " pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904200 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-lib\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904251 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-run\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904278 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-scripts\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904307 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-etc-ovs\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904339 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nchxf\" (UniqueName: \"kubernetes.io/projected/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-kube-api-access-nchxf\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904429 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-log\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904594 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-log\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904637 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-lib\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904718 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-var-run\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.904780 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-etc-ovs\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.906265 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-scripts\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.950424 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjblp" Dec 16 13:04:35 crc kubenswrapper[4757]: I1216 13:04:35.951764 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nchxf\" (UniqueName: \"kubernetes.io/projected/86aeade4-6bed-4d48-ab21-c43ac5b8c06b-kube-api-access-nchxf\") pod \"ovn-controller-ovs-2mmpx\" (UID: \"86aeade4-6bed-4d48-ab21-c43ac5b8c06b\") " pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:36 crc kubenswrapper[4757]: I1216 13:04:36.031276 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.190501 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.193215 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.211558 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.212091 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.212376 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-4mfz7" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.225245 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.226208 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.349367 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410764 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410813 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-config\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410841 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410868 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkwmw\" (UniqueName: \"kubernetes.io/projected/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-kube-api-access-dkwmw\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410914 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410948 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410966 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.410989 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.512826 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.512892 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.512928 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.512953 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.512986 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.513016 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-config\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.513042 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.513063 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkwmw\" (UniqueName: \"kubernetes.io/projected/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-kube-api-access-dkwmw\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc 
kubenswrapper[4757]: I1216 13:04:37.513789 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.514085 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-config\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.514358 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.514810 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.518282 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.538833 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.546291 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.555037 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.565444 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkwmw\" (UniqueName: \"kubernetes.io/projected/89cc68a0-15fd-4a20-bd71-9c8acb5a92c7-kube-api-access-dkwmw\") pod \"ovsdbserver-nb-0\" (UID: \"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7\") " pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:37 crc kubenswrapper[4757]: I1216 13:04:37.842428 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.083718 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.084873 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.090223 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-l8zcn" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.090432 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.090640 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.090749 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.098381 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.248571 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.248622 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgm66\" (UniqueName: \"kubernetes.io/projected/972a26d6-4f3b-4fc4-8e86-055dfe33652a-kube-api-access-vgm66\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.248644 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.248668 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972a26d6-4f3b-4fc4-8e86-055dfe33652a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.248687 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.248728 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972a26d6-4f3b-4fc4-8e86-055dfe33652a-config\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc 
kubenswrapper[4757]: I1216 13:04:39.248933 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.249061 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/972a26d6-4f3b-4fc4-8e86-055dfe33652a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.351534 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/972a26d6-4f3b-4fc4-8e86-055dfe33652a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.351582 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/972a26d6-4f3b-4fc4-8e86-055dfe33652a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352197 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352267 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgm66\" (UniqueName: \"kubernetes.io/projected/972a26d6-4f3b-4fc4-8e86-055dfe33652a-kube-api-access-vgm66\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352290 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352333 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972a26d6-4f3b-4fc4-8e86-055dfe33652a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352356 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352396 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972a26d6-4f3b-4fc4-8e86-055dfe33652a-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.352424 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.353611 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.354057 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972a26d6-4f3b-4fc4-8e86-055dfe33652a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.354292 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972a26d6-4f3b-4fc4-8e86-055dfe33652a-config\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.373886 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.378254 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.378537 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgm66\" (UniqueName: \"kubernetes.io/projected/972a26d6-4f3b-4fc4-8e86-055dfe33652a-kube-api-access-vgm66\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.393738 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/972a26d6-4f3b-4fc4-8e86-055dfe33652a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.409296 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"972a26d6-4f3b-4fc4-8e86-055dfe33652a\") " pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:39 crc kubenswrapper[4757]: I1216 13:04:39.707806 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 13:04:41 crc kubenswrapper[4757]: W1216 13:04:41.820093 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cfaebf1_0d50_42e7_9f5a_94b0894a0a46.slice/crio-10e4e6476d546d460f7758c5727f1409ad4ea39efb50e3ad4e9d65830d8d1d77 WatchSource:0}: Error finding container 10e4e6476d546d460f7758c5727f1409ad4ea39efb50e3ad4e9d65830d8d1d77: Status 404 returned error can't find the container with id 10e4e6476d546d460f7758c5727f1409ad4ea39efb50e3ad4e9d65830d8d1d77 Dec 16 13:04:42 crc kubenswrapper[4757]: I1216 13:04:42.096377 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46","Type":"ContainerStarted","Data":"10e4e6476d546d460f7758c5727f1409ad4ea39efb50e3ad4e9d65830d8d1d77"} Dec 16 13:04:51 crc kubenswrapper[4757]: I1216 13:04:51.181225 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:04:51 crc kubenswrapper[4757]: I1216 13:04:51.181908 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.024418 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.024880 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8ggg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(baba14f2-35db-422f-a583-724854b001d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.026105 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="baba14f2-35db-422f-a583-724854b001d1" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.173624 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="baba14f2-35db-422f-a583-724854b001d1" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.256944 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.257103 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmd5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(c2d16196-ea98-44e5-b859-bea9a8392c01): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:04:52 crc kubenswrapper[4757]: E1216 13:04:52.258540 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="c2d16196-ea98-44e5-b859-bea9a8392c01" Dec 16 13:04:53 crc kubenswrapper[4757]: E1216 13:04:53.177146 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="c2d16196-ea98-44e5-b859-bea9a8392c01" Dec 16 13:05:02 crc kubenswrapper[4757]: E1216 13:05:02.066611 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 16 13:05:02 crc kubenswrapper[4757]: E1216 13:05:02.067392 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins 
/operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nhrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(d0dd86b6-b617-44fa-aabc-f073e1df12ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:05:02 crc kubenswrapper[4757]: E1216 13:05:02.073152 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" Dec 16 13:05:02 crc kubenswrapper[4757]: E1216 13:05:02.233596 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" Dec 16 13:05:03 crc kubenswrapper[4757]: E1216 13:05:03.726500 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 16 13:05:03 crc kubenswrapper[4757]: E1216 
13:05:03.726710 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n58fhc8h66fh5f5h678h9fh54ch557hdch545hd9h545h584h67dh6dh598h67fh59fh99h64bh668hddh5fch599h667hcbh86h5bh66ch5bh567hb9q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s8ftp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(f3e06047-7a9b-46ce-9021-a88b62993e3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:05:03 crc kubenswrapper[4757]: E1216 13:05:03.727949 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="f3e06047-7a9b-46ce-9021-a88b62993e3d" Dec 16 13:05:04 crc kubenswrapper[4757]: E1216 13:05:04.248085 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="f3e06047-7a9b-46ce-9021-a88b62993e3d" Dec 16 13:05:04 crc kubenswrapper[4757]: E1216 13:05:04.859588 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 13:05:04 crc kubenswrapper[4757]: E1216 13:05:04.860090 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wslfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-gsqln_openstack(ea719a31-714a-4959-8a0d-77b7a1ae769f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:05:04 crc kubenswrapper[4757]: E1216 13:05:04.861292 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f"
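Every pull failure in this stretch has the same signature: CRI-O returns rpc error: code = Canceled desc = copying config: context canceled, i.e. the pull RPC was cancelled from the kubelet side while the image config blob was still being copied, and the kubelet then records ErrImagePull and throttles retries as ImagePullBackOff (backing off, by default, up to five minutes). The retries do succeed once the images land; memcached-0 and the dnsmasq pods start around 13:05:21-13:05:23 further down. A minimal client-go sketch for listing pods stuck in either state; the program is illustrative, and it assumes a reachable kubeconfig at the default path plus the openstack namespace seen in these entries:

package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig at the default location with read access to the cluster.
	cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(os.Getenv("HOME"), ".kube", "config"))
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Namespace taken from the log entries above.
	pods, err := client.CoreV1().Pods("openstack").List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, pod := range pods.Items {
		// Init containers (mysql-bootstrap, setup-container, init) fail here too,
		// so walk both status lists.
		for _, cs := range append(pod.Status.InitContainerStatuses, pod.Status.ContainerStatuses...) {
			if w := cs.State.Waiting; w != nil &&
				(w.Reason == "ErrImagePull" || w.Reason == "ImagePullBackOff") {
				fmt.Printf("%s container=%s reason=%s msg=%q\n", pod.Name, cs.Name, w.Reason, w.Message)
			}
		}
	}
}

The same waiting reasons are what kubectl -n openstack get pods summarizes in its STATUS column.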
Dec 16 13:05:05 crc kubenswrapper[4757]: E1216 13:05:05.254161 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" Dec 16 13:05:05 crc kubenswrapper[4757]: I1216 13:05:05.404222 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjblp"] Dec 16 13:05:05 crc kubenswrapper[4757]: W1216 13:05:05.499741 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod824c8db6_764f_4062_85c5_3c0fcbe434ce.slice/crio-7330b84f1957463e89f8b038a5d0f7415015136b006a71bd03cb3f2cb178b9a9 WatchSource:0}: Error finding container 7330b84f1957463e89f8b038a5d0f7415015136b006a71bd03cb3f2cb178b9a9: Status 404 returned error can't find the container with id 7330b84f1957463e89f8b038a5d0f7415015136b006a71bd03cb3f2cb178b9a9 Dec 16 13:05:05 crc kubenswrapper[4757]: I1216 13:05:05.808144 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 13:05:05 crc kubenswrapper[4757]: W1216 13:05:05.812885 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod972a26d6_4f3b_4fc4_8e86_055dfe33652a.slice/crio-e67046c885384250db545f3616b29e454414e71dcd68508c2c2baf41f58c4f5b WatchSource:0}: Error finding container e67046c885384250db545f3616b29e454414e71dcd68508c2c2baf41f58c4f5b: Status 404 returned error can't find the container with id e67046c885384250db545f3616b29e454414e71dcd68508c2c2baf41f58c4f5b Dec 16 13:05:06 crc kubenswrapper[4757]: I1216 13:05:06.258742 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38824624-9325-4515-ab97-157001f60385","Type":"ContainerStarted","Data":"0e85736b68697cab324c82628435878c14a0dc9fd56231019ad728633de68d34"} Dec 16 13:05:06 crc kubenswrapper[4757]: I1216 13:05:06.262486 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp" event={"ID":"824c8db6-764f-4062-85c5-3c0fcbe434ce","Type":"ContainerStarted","Data":"7330b84f1957463e89f8b038a5d0f7415015136b006a71bd03cb3f2cb178b9a9"} Dec 16 13:05:06 crc kubenswrapper[4757]: I1216 13:05:06.263865 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"972a26d6-4f3b-4fc4-8e86-055dfe33652a","Type":"ContainerStarted","Data":"e67046c885384250db545f3616b29e454414e71dcd68508c2c2baf41f58c4f5b"} Dec 16 13:05:06 crc kubenswrapper[4757]: I1216 13:05:06.547360 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 13:05:06 crc kubenswrapper[4757]: I1216 13:05:06.880311 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2mmpx"] Dec 16 13:05:07 crc kubenswrapper[4757]: I1216 13:05:07.271790 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mmpx" event={"ID":"86aeade4-6bed-4d48-ab21-c43ac5b8c06b","Type":"ContainerStarted","Data":"eeb66e1c1b74b5a87468d03c0f33d185e136e5ddc2e835bca9b68aa480b0da58"} Dec 16 13:05:07 crc kubenswrapper[4757]: I1216 13:05:07.273619 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7","Type":"ContainerStarted","Data":"9a7be19582c376e11496088bb05bbf5b8deeb26fb1d3c0e92f250ea0afdced5a"} Dec 16 13:05:07 crc kubenswrapper[4757]: E1216 13:05:07.799133 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 13:05:07 crc kubenswrapper[4757]: E1216 13:05:07.799624 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqv2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bxfw6_openstack(0566503d-6ae6-445d-b934-1910ca733474): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:05:07 crc kubenswrapper[4757]: E1216 13:05:07.800834 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6" podUID="0566503d-6ae6-445d-b934-1910ca733474" Dec 16 13:05:08 crc kubenswrapper[4757]: E1216 13:05:08.328216 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 13:05:08 crc kubenswrapper[4757]: E1216 13:05:08.328403 4757 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vqcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dfc6j_openstack(6cd3cbd0-b882-434b-b6ae-62e6bc9831be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:05:08 crc kubenswrapper[4757]: E1216 13:05:08.330450 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j" podUID="6cd3cbd0-b882-434b-b6ae-62e6bc9831be" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.550901 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.703844 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqv2d\" (UniqueName: \"kubernetes.io/projected/0566503d-6ae6-445d-b934-1910ca733474-kube-api-access-mqv2d\") pod \"0566503d-6ae6-445d-b934-1910ca733474\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.704029 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-dns-svc\") pod \"0566503d-6ae6-445d-b934-1910ca733474\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.704135 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-config\") pod \"0566503d-6ae6-445d-b934-1910ca733474\" (UID: \"0566503d-6ae6-445d-b934-1910ca733474\") " Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.704783 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0566503d-6ae6-445d-b934-1910ca733474" (UID: "0566503d-6ae6-445d-b934-1910ca733474"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.705239 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-config" (OuterVolumeSpecName: "config") pod "0566503d-6ae6-445d-b934-1910ca733474" (UID: "0566503d-6ae6-445d-b934-1910ca733474"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.710110 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0566503d-6ae6-445d-b934-1910ca733474-kube-api-access-mqv2d" (OuterVolumeSpecName: "kube-api-access-mqv2d") pod "0566503d-6ae6-445d-b934-1910ca733474" (UID: "0566503d-6ae6-445d-b934-1910ca733474"). InnerVolumeSpecName "kube-api-access-mqv2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.805722 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqv2d\" (UniqueName: \"kubernetes.io/projected/0566503d-6ae6-445d-b934-1910ca733474-kube-api-access-mqv2d\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.805773 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:08 crc kubenswrapper[4757]: I1216 13:05:08.805787 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0566503d-6ae6-445d-b934-1910ca733474-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.288556 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6" Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.288607 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bxfw6" event={"ID":"0566503d-6ae6-445d-b934-1910ca733474","Type":"ContainerDied","Data":"51757cb1c7a6375fe9ac898eae7b1cccdc7d8efe3f65eb365fb7a23bf84962cb"} Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.357161 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxfw6"] Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.365904 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxfw6"] Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.606843 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j" Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.734022 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-config\") pod \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.734211 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqcn\" (UniqueName: \"kubernetes.io/projected/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-kube-api-access-8vqcn\") pod \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\" (UID: \"6cd3cbd0-b882-434b-b6ae-62e6bc9831be\") " Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.734586 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-config" (OuterVolumeSpecName: "config") pod "6cd3cbd0-b882-434b-b6ae-62e6bc9831be" (UID: "6cd3cbd0-b882-434b-b6ae-62e6bc9831be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.739319 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-kube-api-access-8vqcn" (OuterVolumeSpecName: "kube-api-access-8vqcn") pod "6cd3cbd0-b882-434b-b6ae-62e6bc9831be" (UID: "6cd3cbd0-b882-434b-b6ae-62e6bc9831be"). InnerVolumeSpecName "kube-api-access-8vqcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.835836 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqcn\" (UniqueName: \"kubernetes.io/projected/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-kube-api-access-8vqcn\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:09 crc kubenswrapper[4757]: I1216 13:05:09.835875 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3cbd0-b882-434b-b6ae-62e6bc9831be-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:09 crc kubenswrapper[4757]: E1216 13:05:09.903766 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 13:05:09 crc kubenswrapper[4757]: E1216 13:05:09.904203 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jbgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-t2b69_openstack(6aa77e81-cd27-4624-8604-684ae64ff3fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:05:09 crc kubenswrapper[4757]: E1216 13:05:09.905625 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" Dec 16 13:05:10 crc kubenswrapper[4757]: I1216 13:05:10.297869 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j" Dec 16 13:05:10 crc kubenswrapper[4757]: I1216 13:05:10.304111 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dfc6j" event={"ID":"6cd3cbd0-b882-434b-b6ae-62e6bc9831be","Type":"ContainerDied","Data":"8127aff482668694557b562dcc189024186e4763dfa0a0aefccd8442e46633b2"} Dec 16 13:05:10 crc kubenswrapper[4757]: E1216 13:05:10.305132 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" Dec 16 13:05:10 crc kubenswrapper[4757]: I1216 13:05:10.359394 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dfc6j"] Dec 16 13:05:10 crc kubenswrapper[4757]: I1216 13:05:10.369046 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dfc6j"] Dec 16 13:05:10 crc kubenswrapper[4757]: I1216 13:05:10.961331 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0566503d-6ae6-445d-b934-1910ca733474" path="/var/lib/kubelet/pods/0566503d-6ae6-445d-b934-1910ca733474/volumes" Dec 16 13:05:10 crc kubenswrapper[4757]: I1216 13:05:10.961973 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd3cbd0-b882-434b-b6ae-62e6bc9831be" path="/var/lib/kubelet/pods/6cd3cbd0-b882-434b-b6ae-62e6bc9831be/volumes" Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.181290 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.181870 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.181930 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.182650 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3d5810574004acc14ba78a28c621226c4b91fbf94cdc448f59cddf05b4cabae"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.182722 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" 
containerID="cri-o://a3d5810574004acc14ba78a28c621226c4b91fbf94cdc448f59cddf05b4cabae" gracePeriod=600 Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.401498 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="a3d5810574004acc14ba78a28c621226c4b91fbf94cdc448f59cddf05b4cabae" exitCode=0 Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.401546 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"a3d5810574004acc14ba78a28c621226c4b91fbf94cdc448f59cddf05b4cabae"} Dec 16 13:05:21 crc kubenswrapper[4757]: I1216 13:05:21.401597 4757 scope.go:117] "RemoveContainer" containerID="2a23f8e521631b063ae4952d912ce6130192fc2c50ebd364a75b084a90f4b969" Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.412711 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"feaab26a71eb3b6535920da4cbeacb812adf972fd9cda852a626b3b8fac4ff4e"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.421685 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7","Type":"ContainerStarted","Data":"bbf0d65758a67170b31925df8b21c14cfa656002c25f7d3c2f89dbd7bdd18030"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.424085 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2d16196-ea98-44e5-b859-bea9a8392c01","Type":"ContainerStarted","Data":"df3105c990c47a49609e9adcd8857e573f4ad6c7dcbb47cfe0aaf38e0a122a72"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.438602 4757 generic.go:334] "Generic (PLEG): container finished" podID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerID="59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b" exitCode=0 Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.438667 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" event={"ID":"ea719a31-714a-4959-8a0d-77b7a1ae769f","Type":"ContainerDied","Data":"59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.445182 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d0dd86b6-b617-44fa-aabc-f073e1df12ca","Type":"ContainerStarted","Data":"8719fe889cba7bf8bb1f4102e84c7999b73788e4f5119eaa39f5b154a8014058"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.453537 4757 generic.go:334] "Generic (PLEG): container finished" podID="86aeade4-6bed-4d48-ab21-c43ac5b8c06b" containerID="5f52e8db40d6c77ae2739369b1437a093c9af2592e5adbedbf98f3eb93d93886" exitCode=0 Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.453626 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mmpx" event={"ID":"86aeade4-6bed-4d48-ab21-c43ac5b8c06b","Type":"ContainerDied","Data":"5f52e8db40d6c77ae2739369b1437a093c9af2592e5adbedbf98f3eb93d93886"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.461322 4757 generic.go:334] "Generic (PLEG): container finished" podID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerID="23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8" exitCode=0 Dec 16 13:05:22 crc kubenswrapper[4757]: 
Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.461415 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" event={"ID":"6aa77e81-cd27-4624-8604-684ae64ff3fb","Type":"ContainerDied","Data":"23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.464729 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46","Type":"ContainerStarted","Data":"03d15000b00351d5d25b7208264e452cfc388862e89f1407d069b9b70859c816"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.465786 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.482337 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"baba14f2-35db-422f-a583-724854b001d1","Type":"ContainerStarted","Data":"057e65bd707580ee4bd8c43b5932acd957840860bfc1eb42778610fa48c14f3b"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.519295 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp" event={"ID":"824c8db6-764f-4062-85c5-3c0fcbe434ce","Type":"ContainerStarted","Data":"0b334ef04d1af5158e09ff3985db3d7b48dd8b59d34fe973381190624877e870"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.519911 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xjblp" Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.572186 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"972a26d6-4f3b-4fc4-8e86-055dfe33652a","Type":"ContainerStarted","Data":"7da0cab0abe1ead19771d833154f026db5651194b4fbb051c986a82cde311e90"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.605044 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f3e06047-7a9b-46ce-9021-a88b62993e3d","Type":"ContainerStarted","Data":"f09362cf4923e150053eb0d422f58f746fbc6ec149a963ba7be60d079940342f"} Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.605684 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.635469 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.450558613 podStartE2EDuration="50.635446778s" podCreationTimestamp="2025-12-16 13:04:32 +0000 UTC" firstStartedPulling="2025-12-16 13:04:41.838422547 +0000 UTC m=+1067.266166343" lastFinishedPulling="2025-12-16 13:05:21.023310712 +0000 UTC m=+1106.451054508" observedRunningTime="2025-12-16 13:05:22.599796717 +0000 UTC m=+1108.027540513" watchObservedRunningTime="2025-12-16 13:05:22.635446778 +0000 UTC m=+1108.063190574" Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.652936 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xjblp" podStartSLOduration=32.12068847 podStartE2EDuration="47.652913939s" podCreationTimestamp="2025-12-16 13:04:35 +0000 UTC" firstStartedPulling="2025-12-16 13:05:05.50242408 +0000 UTC m=+1090.930167886" lastFinishedPulling="2025-12-16 13:05:21.034649559 +0000 UTC m=+1106.462393355" observedRunningTime="2025-12-16 13:05:22.622676507 +0000 UTC m=+1108.050420303" watchObservedRunningTime="2025-12-16 13:05:22.652913939 +0000 UTC m=+1108.080657735"
Dec 16 13:05:22 crc kubenswrapper[4757]: I1216 13:05:22.668752 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.881771234 podStartE2EDuration="52.668728993s" podCreationTimestamp="2025-12-16 13:04:30 +0000 UTC" firstStartedPulling="2025-12-16 13:04:31.260199305 +0000 UTC m=+1056.687943101" lastFinishedPulling="2025-12-16 13:05:21.047157064 +0000 UTC m=+1106.474900860" observedRunningTime="2025-12-16 13:05:22.665388173 +0000 UTC m=+1108.093131969" watchObservedRunningTime="2025-12-16 13:05:22.668728993 +0000 UTC m=+1108.096472789" Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.623460 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mmpx" event={"ID":"86aeade4-6bed-4d48-ab21-c43ac5b8c06b","Type":"ContainerStarted","Data":"ecb74179c99f0b396e9fca3f3d39b76da7b45a53e309cfb605aa98c836b8cf99"} Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.623964 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2mmpx" event={"ID":"86aeade4-6bed-4d48-ab21-c43ac5b8c06b","Type":"ContainerStarted","Data":"85d208dda47dbc9e24818aecfd3cbf87dba0946762a94b415fdd4b6e039b28a9"} Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.624094 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.624126 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.633355 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" event={"ID":"6aa77e81-cd27-4624-8604-684ae64ff3fb","Type":"ContainerStarted","Data":"ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb"} Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.633736 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.640225 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" event={"ID":"ea719a31-714a-4959-8a0d-77b7a1ae769f","Type":"ContainerStarted","Data":"95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db"} Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.656623 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2mmpx" podStartSLOduration=34.643016577 podStartE2EDuration="48.656599905s" podCreationTimestamp="2025-12-16 13:04:35 +0000 UTC" firstStartedPulling="2025-12-16 13:05:06.892361519 +0000 UTC m=+1092.320105315" lastFinishedPulling="2025-12-16 13:05:20.905944847 +0000 UTC m=+1106.333688643" observedRunningTime="2025-12-16 13:05:23.643928116 +0000 UTC m=+1109.071671912" watchObservedRunningTime="2025-12-16 13:05:23.656599905 +0000 UTC m=+1109.084343711" Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.665552 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" podStartSLOduration=3.568152816 podStartE2EDuration="58.665536335s" podCreationTimestamp="2025-12-16 13:04:25 +0000 UTC" firstStartedPulling="2025-12-16 13:04:26.819452093 +0000 UTC m=+1052.247195879" lastFinishedPulling="2025-12-16 13:05:21.916835602 +0000 UTC m=+1107.344579398" observedRunningTime="2025-12-16 13:05:23.661245544 +0000 UTC m=+1109.088989340" watchObservedRunningTime="2025-12-16 13:05:23.665536335 +0000 UTC m=+1109.093280121"
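The pod_startup_latency_tracker entries are internally consistent and worth decoding: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. startup time excluding pull latency. For memcached-0 above: 52.668728993s − 49.786957759s = 2.881771234s, which matches the logged value exactly. A small Go check with the timestamps copied from that entry (the formula is reconstructed from the logged fields; the authoritative definition lives in the kubelet's pod_startup_latency_tracker.go):

package main

import (
	"fmt"
	"time"
)

// mustParse handles the "2025-12-16 13:04:30 +0000 UTC" format used in the log.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the memcached-0 "Observed pod startup duration" entry.
	created := mustParse("2025-12-16 13:04:30 +0000 UTC")
	firstPull := mustParse("2025-12-16 13:04:31.260199305 +0000 UTC")
	lastPull := mustParse("2025-12-16 13:05:21.047157064 +0000 UTC")
	running := mustParse("2025-12-16 13:05:22.668728993 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration = E2E minus image-pull time
	fmt.Println(e2e, slo)                // prints: 52.668728993s 2.881771234s
}

Read this way, nearly all of memcached-0's 52-second startup was spent waiting out the image pull and the backoffs recorded earlier in the log.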
Dec 16 13:05:23 crc kubenswrapper[4757]: I1216 13:05:23.680154 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" podStartSLOduration=3.506640686 podStartE2EDuration="57.6801362s" podCreationTimestamp="2025-12-16 13:04:26 +0000 UTC" firstStartedPulling="2025-12-16 13:04:27.3539456 +0000 UTC m=+1052.781689406" lastFinishedPulling="2025-12-16 13:05:21.527441124 +0000 UTC m=+1106.955184920" observedRunningTime="2025-12-16 13:05:23.677359764 +0000 UTC m=+1109.105103560" watchObservedRunningTime="2025-12-16 13:05:23.6801362 +0000 UTC m=+1109.107879996" Dec 16 13:05:26 crc kubenswrapper[4757]: I1216 13:05:26.029187 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.672309 4757 generic.go:334] "Generic (PLEG): container finished" podID="c2d16196-ea98-44e5-b859-bea9a8392c01" containerID="df3105c990c47a49609e9adcd8857e573f4ad6c7dcbb47cfe0aaf38e0a122a72" exitCode=0 Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.672863 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2d16196-ea98-44e5-b859-bea9a8392c01","Type":"ContainerDied","Data":"df3105c990c47a49609e9adcd8857e573f4ad6c7dcbb47cfe0aaf38e0a122a72"} Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.675364 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"972a26d6-4f3b-4fc4-8e86-055dfe33652a","Type":"ContainerStarted","Data":"4dad9e9a5933a3aa52cc3e0950cb514423bd2f1483c8a95df663a36d8ab53791"} Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.677971 4757 generic.go:334] "Generic (PLEG): container finished" podID="baba14f2-35db-422f-a583-724854b001d1" containerID="057e65bd707580ee4bd8c43b5932acd957840860bfc1eb42778610fa48c14f3b" exitCode=0 Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.678082 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"baba14f2-35db-422f-a583-724854b001d1","Type":"ContainerDied","Data":"057e65bd707580ee4bd8c43b5932acd957840860bfc1eb42778610fa48c14f3b"} Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.683166 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"89cc68a0-15fd-4a20-bd71-9c8acb5a92c7","Type":"ContainerStarted","Data":"3e89715bc95448f83d7a435a9ea54326245a5f7271d18f08d7c3dc9bf85562d4"} Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.709137 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.731438 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=31.936414006 podStartE2EDuration="51.731420322s" podCreationTimestamp="2025-12-16 13:04:36 +0000 UTC" firstStartedPulling="2025-12-16 13:05:06.557832045 +0000 UTC m=+1091.985575841" lastFinishedPulling="2025-12-16 13:05:26.352838361 +0000 UTC m=+1111.780582157" observedRunningTime="2025-12-16 13:05:27.724829597 +0000 UTC m=+1113.152573393" watchObservedRunningTime="2025-12-16 13:05:27.731420322 +0000 UTC m=+1113.159164118" Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.751415 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=29.193600169 podStartE2EDuration="49.751391213s" podCreationTimestamp="2025-12-16 13:04:38 +0000 UTC" firstStartedPulling="2025-12-16 13:05:05.815356936 +0000 UTC m=+1091.243100732" lastFinishedPulling="2025-12-16 13:05:26.37314798 +0000 UTC m=+1111.800891776" observedRunningTime="2025-12-16 13:05:27.743840825 +0000 UTC m=+1113.171584621" watchObservedRunningTime="2025-12-16 13:05:27.751391213 +0000 UTC m=+1113.179135009" Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.783853 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 16 13:05:27 crc kubenswrapper[4757]: I1216 13:05:27.843478 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.692128 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"baba14f2-35db-422f-a583-724854b001d1","Type":"ContainerStarted","Data":"8d84920302fba752434738237eaa54fe94d74bd053adb3b0fdc2c25ca7ebc8b3"} Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.694950 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2d16196-ea98-44e5-b859-bea9a8392c01","Type":"ContainerStarted","Data":"fa09126e9d3099179b5aa9d772e5d7ba0c095e526b04a1f23c6456c8786e08ad"} Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.696217 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.722184 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.715751455 podStartE2EDuration="59.722167363s" podCreationTimestamp="2025-12-16 13:04:29 +0000 UTC" firstStartedPulling="2025-12-16 13:04:31.745035411 +0000 UTC m=+1057.172779207" lastFinishedPulling="2025-12-16 13:05:11.751451319 +0000 UTC m=+1097.179195115" observedRunningTime="2025-12-16 13:05:28.716027238 +0000 UTC m=+1114.143771024" watchObservedRunningTime="2025-12-16 13:05:28.722167363 +0000 UTC m=+1114.149911159" Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.729255 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.737188 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=11.56257273 podStartE2EDuration="1m1.737173597s" podCreationTimestamp="2025-12-16 13:04:27 +0000 UTC" firstStartedPulling="2025-12-16 13:04:29.900116029 +0000 UTC m=+1055.327859825" lastFinishedPulling="2025-12-16 13:05:20.074716896 +0000 UTC m=+1105.502460692" observedRunningTime="2025-12-16 13:05:28.735666291 +0000 UTC m=+1114.163410087" watchObservedRunningTime="2025-12-16 13:05:28.737173597 +0000 UTC m=+1114.164917393" Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.843371 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 16 13:05:28 crc kubenswrapper[4757]: I1216 13:05:28.884320 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.025306 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gsqln"] Dec 16 13:05:29 crc kubenswrapper[4757]: 
I1216 13:05:29.025532 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerName="dnsmasq-dns" containerID="cri-o://95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db" gracePeriod=10 Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.029361 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.042342 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.042379 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.059147 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-mnd57"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.060621 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.067681 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.099967 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-mnd57"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.118217 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-d654b"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.119422 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.123997 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.161528 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d654b"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.187228 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qxt\" (UniqueName: \"kubernetes.io/projected/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-kube-api-access-z9qxt\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.187327 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.187376 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-config\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.187390 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288774 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e270a555-95f7-466f-8bb6-e76836a33d68-ovs-rundir\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288829 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288860 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e270a555-95f7-466f-8bb6-e76836a33d68-config\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288883 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e270a555-95f7-466f-8bb6-e76836a33d68-ovn-rundir\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " 
pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288909 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-config\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288922 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288949 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e270a555-95f7-466f-8bb6-e76836a33d68-combined-ca-bundle\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288968 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e270a555-95f7-466f-8bb6-e76836a33d68-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.288992 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qxt\" (UniqueName: \"kubernetes.io/projected/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-kube-api-access-z9qxt\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.289047 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hktxm\" (UniqueName: \"kubernetes.io/projected/e270a555-95f7-466f-8bb6-e76836a33d68-kube-api-access-hktxm\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.289835 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.290326 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-config\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.290682 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.309801 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qxt\" (UniqueName: \"kubernetes.io/projected/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-kube-api-access-z9qxt\") pod \"dnsmasq-dns-6bc7876d45-mnd57\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390226 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e270a555-95f7-466f-8bb6-e76836a33d68-config\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390296 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e270a555-95f7-466f-8bb6-e76836a33d68-ovn-rundir\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390345 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e270a555-95f7-466f-8bb6-e76836a33d68-combined-ca-bundle\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390377 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e270a555-95f7-466f-8bb6-e76836a33d68-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390449 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hktxm\" (UniqueName: \"kubernetes.io/projected/e270a555-95f7-466f-8bb6-e76836a33d68-kube-api-access-hktxm\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390481 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e270a555-95f7-466f-8bb6-e76836a33d68-ovs-rundir\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390796 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e270a555-95f7-466f-8bb6-e76836a33d68-ovs-rundir\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.390938 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.391880 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e270a555-95f7-466f-8bb6-e76836a33d68-config\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.391937 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e270a555-95f7-466f-8bb6-e76836a33d68-ovn-rundir\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.396687 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e270a555-95f7-466f-8bb6-e76836a33d68-combined-ca-bundle\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.410567 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e270a555-95f7-466f-8bb6-e76836a33d68-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.410617 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hktxm\" (UniqueName: \"kubernetes.io/projected/e270a555-95f7-466f-8bb6-e76836a33d68-kube-api-access-hktxm\") pod \"ovn-controller-metrics-d654b\" (UID: \"e270a555-95f7-466f-8bb6-e76836a33d68\") " pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.509894 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d654b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.585308 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2b69"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.585617 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerName="dnsmasq-dns" containerID="cri-o://ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb" gracePeriod=10 Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.592592 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.611465 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.654461 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-7dxbf"] Dec 16 13:05:29 crc kubenswrapper[4757]: E1216 13:05:29.654897 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerName="dnsmasq-dns" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.654922 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerName="dnsmasq-dns" Dec 16 13:05:29 crc kubenswrapper[4757]: E1216 13:05:29.654939 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerName="init" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.654948 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerName="init" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.655149 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerName="dnsmasq-dns" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.656154 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.659417 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.665731 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7dxbf"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.696614 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-dns-svc\") pod \"ea719a31-714a-4959-8a0d-77b7a1ae769f\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.698352 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wslfx\" (UniqueName: \"kubernetes.io/projected/ea719a31-714a-4959-8a0d-77b7a1ae769f-kube-api-access-wslfx\") pod \"ea719a31-714a-4959-8a0d-77b7a1ae769f\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.698627 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-config\") pod \"ea719a31-714a-4959-8a0d-77b7a1ae769f\" (UID: \"ea719a31-714a-4959-8a0d-77b7a1ae769f\") " Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.707166 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea719a31-714a-4959-8a0d-77b7a1ae769f-kube-api-access-wslfx" (OuterVolumeSpecName: "kube-api-access-wslfx") pod "ea719a31-714a-4959-8a0d-77b7a1ae769f" (UID: "ea719a31-714a-4959-8a0d-77b7a1ae769f"). InnerVolumeSpecName "kube-api-access-wslfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.747506 4757 generic.go:334] "Generic (PLEG): container finished" podID="ea719a31-714a-4959-8a0d-77b7a1ae769f" containerID="95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db" exitCode=0 Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.748862 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.749424 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" event={"ID":"ea719a31-714a-4959-8a0d-77b7a1ae769f","Type":"ContainerDied","Data":"95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db"} Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.749721 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gsqln" event={"ID":"ea719a31-714a-4959-8a0d-77b7a1ae769f","Type":"ContainerDied","Data":"b7b5c6469f14948f0b892091d5b49e4d4fa72ccdf614015fbca20f726ccd348a"} Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.749742 4757 scope.go:117] "RemoveContainer" containerID="95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.750090 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea719a31-714a-4959-8a0d-77b7a1ae769f" (UID: "ea719a31-714a-4959-8a0d-77b7a1ae769f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.766196 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-config" (OuterVolumeSpecName: "config") pod "ea719a31-714a-4959-8a0d-77b7a1ae769f" (UID: "ea719a31-714a-4959-8a0d-77b7a1ae769f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.803750 4757 scope.go:117] "RemoveContainer" containerID="59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.809874 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-config\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.809933 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.809957 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-dns-svc\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.810036 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j4kb\" (UniqueName: \"kubernetes.io/projected/ba36bade-6096-4993-96e6-9883fbc19640-kube-api-access-6j4kb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.810080 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.810136 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wslfx\" (UniqueName: \"kubernetes.io/projected/ea719a31-714a-4959-8a0d-77b7a1ae769f-kube-api-access-wslfx\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.810148 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.810159 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea719a31-714a-4959-8a0d-77b7a1ae769f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.835442 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.843118 4757 scope.go:117] "RemoveContainer" containerID="95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db" Dec 16 13:05:29 crc kubenswrapper[4757]: E1216 13:05:29.848781 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db\": container with ID starting with 95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db not found: ID does not exist" containerID="95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.848833 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db"} err="failed to get container status \"95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db\": rpc error: code = NotFound desc = could not find container \"95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db\": container with ID starting with 95820b5bba46e3e4414e117d06a2c978b2ce04b17dd4ce8e9a3f2991822228db not found: ID does not exist" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.848865 4757 scope.go:117] "RemoveContainer" containerID="59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b" Dec 16 13:05:29 crc kubenswrapper[4757]: E1216 13:05:29.852337 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b\": container with ID starting with 59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b not found: ID does not exist" containerID="59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.852398 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b"} err="failed to get container status \"59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b\": rpc error: code = NotFound desc = could not find container \"59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b\": container with ID starting with 59f989ed12dedade36d1b5a373f1df8081947fcd57b646091aef62874a48603b not found: ID does not exist" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.912228 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j4kb\" (UniqueName: \"kubernetes.io/projected/ba36bade-6096-4993-96e6-9883fbc19640-kube-api-access-6j4kb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.912673 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.912975 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-config\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.913405 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.913551 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-dns-svc\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.915280 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.915907 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-config\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.916751 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.917109 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-dns-svc\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.935228 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-mnd57"] Dec 16 13:05:29 crc kubenswrapper[4757]: I1216 13:05:29.942828 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j4kb\" (UniqueName: \"kubernetes.io/projected/ba36bade-6096-4993-96e6-9883fbc19640-kube-api-access-6j4kb\") pod \"dnsmasq-dns-8554648995-7dxbf\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.028415 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.099877 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.103951 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.109596 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.110393 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.110543 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.110762 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qjlh9" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.110928 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.155140 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gsqln"] Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.165671 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gsqln"] Dec 16 13:05:30 crc kubenswrapper[4757]: W1216 13:05:30.188571 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode270a555_95f7_466f_8bb6_e76836a33d68.slice/crio-5a38c2d14fb517f58d7f0620ba76aa353733762244db62fe31ddc14c46fdf91f WatchSource:0}: Error finding container 5a38c2d14fb517f58d7f0620ba76aa353733762244db62fe31ddc14c46fdf91f: Status 404 returned error can't find the container with id 5a38c2d14fb517f58d7f0620ba76aa353733762244db62fe31ddc14c46fdf91f Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.217569 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d654b"] Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.233661 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.233720 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-config\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.233793 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxst\" (UniqueName: \"kubernetes.io/projected/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-kube-api-access-7mxst\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.233824 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-scripts\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.233991 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.234058 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.234120 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.336856 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.337440 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-config\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.337643 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mxst\" (UniqueName: \"kubernetes.io/projected/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-kube-api-access-7mxst\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.337773 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-scripts\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.337918 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.337988 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.338190 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.338336 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.339235 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-scripts\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.339766 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-config\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.365677 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.365901 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.374797 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.389992 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxst\" (UniqueName: \"kubernetes.io/projected/16cd2aac-d1cc-4f30-8b86-8fd811f20f88-kube-api-access-7mxst\") pod \"ovn-northd-0\" (UID: \"16cd2aac-d1cc-4f30-8b86-8fd811f20f88\") " pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.432444 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.459533 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.540135 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-dns-svc\") pod \"6aa77e81-cd27-4624-8604-684ae64ff3fb\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.540466 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jbgn\" (UniqueName: \"kubernetes.io/projected/6aa77e81-cd27-4624-8604-684ae64ff3fb-kube-api-access-7jbgn\") pod \"6aa77e81-cd27-4624-8604-684ae64ff3fb\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.540512 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-config\") pod \"6aa77e81-cd27-4624-8604-684ae64ff3fb\" (UID: \"6aa77e81-cd27-4624-8604-684ae64ff3fb\") " Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.549376 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa77e81-cd27-4624-8604-684ae64ff3fb-kube-api-access-7jbgn" (OuterVolumeSpecName: "kube-api-access-7jbgn") pod "6aa77e81-cd27-4624-8604-684ae64ff3fb" (UID: "6aa77e81-cd27-4624-8604-684ae64ff3fb"). InnerVolumeSpecName "kube-api-access-7jbgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.592684 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-config" (OuterVolumeSpecName: "config") pod "6aa77e81-cd27-4624-8604-684ae64ff3fb" (UID: "6aa77e81-cd27-4624-8604-684ae64ff3fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.599650 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6aa77e81-cd27-4624-8604-684ae64ff3fb" (UID: "6aa77e81-cd27-4624-8604-684ae64ff3fb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.608247 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.642998 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jbgn\" (UniqueName: \"kubernetes.io/projected/6aa77e81-cd27-4624-8604-684ae64ff3fb-kube-api-access-7jbgn\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.646051 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.646068 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aa77e81-cd27-4624-8604-684ae64ff3fb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.764052 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" event={"ID":"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9","Type":"ContainerStarted","Data":"0768cfd1f93bd3748d592ac222a105609742b6ec922693f6d8ed055d324a52e3"} Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.771539 4757 generic.go:334] "Generic (PLEG): container finished" podID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerID="ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb" exitCode=0 Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.771598 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" event={"ID":"6aa77e81-cd27-4624-8604-684ae64ff3fb","Type":"ContainerDied","Data":"ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb"} Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.771627 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" event={"ID":"6aa77e81-cd27-4624-8604-684ae64ff3fb","Type":"ContainerDied","Data":"1fcd4511cc86d85869f89cef54bdc9dd0f559063bf47157ba4235092b934b64d"} Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.771645 4757 scope.go:117] "RemoveContainer" containerID="ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.771775 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2b69" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.780112 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d654b" event={"ID":"e270a555-95f7-466f-8bb6-e76836a33d68","Type":"ContainerStarted","Data":"5a38c2d14fb517f58d7f0620ba76aa353733762244db62fe31ddc14c46fdf91f"} Dec 16 13:05:30 crc kubenswrapper[4757]: W1216 13:05:30.814139 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba36bade_6096_4993_96e6_9883fbc19640.slice/crio-b11ab42b5a243ce737767eaefe8400195b0f09bfaed28f3f6567fbf53aefd22b WatchSource:0}: Error finding container b11ab42b5a243ce737767eaefe8400195b0f09bfaed28f3f6567fbf53aefd22b: Status 404 returned error can't find the container with id b11ab42b5a243ce737767eaefe8400195b0f09bfaed28f3f6567fbf53aefd22b Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.818368 4757 scope.go:117] "RemoveContainer" containerID="23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.821314 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7dxbf"] Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.841275 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2b69"] Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.854943 4757 scope.go:117] "RemoveContainer" containerID="ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.855305 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2b69"] Dec 16 13:05:30 crc kubenswrapper[4757]: E1216 13:05:30.863309 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb\": container with ID starting with ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb not found: ID does not exist" containerID="ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.863372 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb"} err="failed to get container status \"ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb\": rpc error: code = NotFound desc = could not find container \"ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb\": container with ID starting with ceb873f2e6987ccceac06da817cb071eb58236fc67ec265bbc5ef864a4b83bcb not found: ID does not exist" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.863407 4757 scope.go:117] "RemoveContainer" containerID="23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8" Dec 16 13:05:30 crc kubenswrapper[4757]: E1216 13:05:30.863880 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8\": container with ID starting with 23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8 not found: ID does not exist" containerID="23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.863901 4757 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8"} err="failed to get container status \"23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8\": rpc error: code = NotFound desc = could not find container \"23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8\": container with ID starting with 23bfabcc33fc2a581044ce0768ba27dd66aec0118a67ba1d0d64a53d998445f8 not found: ID does not exist" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.936891 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.936950 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.963662 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" path="/var/lib/kubelet/pods/6aa77e81-cd27-4624-8604-684ae64ff3fb/volumes" Dec 16 13:05:30 crc kubenswrapper[4757]: I1216 13:05:30.964244 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea719a31-714a-4959-8a0d-77b7a1ae769f" path="/var/lib/kubelet/pods/ea719a31-714a-4959-8a0d-77b7a1ae769f/volumes" Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.066809 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.790262 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d654b" event={"ID":"e270a555-95f7-466f-8bb6-e76836a33d68","Type":"ContainerStarted","Data":"af75434ff9668a9e7b091997557695060e6ef0f413cf6af9bd60325616dd2a93"} Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.792721 4757 generic.go:334] "Generic (PLEG): container finished" podID="ba36bade-6096-4993-96e6-9883fbc19640" containerID="a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87" exitCode=0 Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.792787 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7dxbf" event={"ID":"ba36bade-6096-4993-96e6-9883fbc19640","Type":"ContainerDied","Data":"a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87"} Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.792808 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7dxbf" event={"ID":"ba36bade-6096-4993-96e6-9883fbc19640","Type":"ContainerStarted","Data":"b11ab42b5a243ce737767eaefe8400195b0f09bfaed28f3f6567fbf53aefd22b"} Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.794984 4757 generic.go:334] "Generic (PLEG): container finished" podID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerID="7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4" exitCode=0 Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.795439 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" event={"ID":"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9","Type":"ContainerDied","Data":"7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4"} Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.798277 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"16cd2aac-d1cc-4f30-8b86-8fd811f20f88","Type":"ContainerStarted","Data":"eea89c6ceb098f0f2abd70bac70c20c6a66a269e7eec05257540349611c2d0c4"} Dec 16 13:05:31 crc kubenswrapper[4757]: I1216 13:05:31.818265 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-d654b" podStartSLOduration=2.818239712 podStartE2EDuration="2.818239712s" podCreationTimestamp="2025-12-16 13:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:31.813988632 +0000 UTC m=+1117.241732428" watchObservedRunningTime="2025-12-16 13:05:31.818239712 +0000 UTC m=+1117.245983528" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.821855 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7dxbf" event={"ID":"ba36bade-6096-4993-96e6-9883fbc19640","Type":"ContainerStarted","Data":"6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8"} Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.822225 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.831055 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" event={"ID":"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9","Type":"ContainerStarted","Data":"569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3"} Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.831102 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.858931 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-mnd57"] Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.870313 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.912680 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-7dxbf" podStartSLOduration=3.912661026 podStartE2EDuration="3.912661026s" podCreationTimestamp="2025-12-16 13:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:32.881334448 +0000 UTC m=+1118.309078244" watchObservedRunningTime="2025-12-16 13:05:32.912661026 +0000 UTC m=+1118.340404822" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.923060 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g2xbp"] Dec 16 13:05:32 crc kubenswrapper[4757]: E1216 13:05:32.923392 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerName="dnsmasq-dns" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.923410 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerName="dnsmasq-dns" Dec 16 13:05:32 crc kubenswrapper[4757]: E1216 13:05:32.923461 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerName="init" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.923469 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerName="init" Dec 16 13:05:32 
crc kubenswrapper[4757]: I1216 13:05:32.923620 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa77e81-cd27-4624-8604-684ae64ff3fb" containerName="dnsmasq-dns" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.924397 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.985716 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g2xbp"] Dec 16 13:05:32 crc kubenswrapper[4757]: I1216 13:05:32.998374 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" podStartSLOduration=3.9983515560000003 podStartE2EDuration="3.998351556s" podCreationTimestamp="2025-12-16 13:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:32.978925908 +0000 UTC m=+1118.406669704" watchObservedRunningTime="2025-12-16 13:05:32.998351556 +0000 UTC m=+1118.426095352" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.121042 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.121130 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-config\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.121194 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.121232 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.121319 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4277l\" (UniqueName: \"kubernetes.io/projected/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-kube-api-access-4277l\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.222963 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-config\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.223516 
4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.223551 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.223771 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-config\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.224300 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.224535 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.224666 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4277l\" (UniqueName: \"kubernetes.io/projected/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-kube-api-access-4277l\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.225073 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.225711 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.259430 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4277l\" (UniqueName: \"kubernetes.io/projected/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-kube-api-access-4277l\") pod \"dnsmasq-dns-b8fbc5445-g2xbp\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.263130 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.596944 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g2xbp"] Dec 16 13:05:33 crc kubenswrapper[4757]: I1216 13:05:33.841523 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" event={"ID":"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240","Type":"ContainerStarted","Data":"f9bffc40f3a7f1251c13f79954d2f907577a02c933cb5aebc90c6e6120fb6ce0"} Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.101174 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.106046 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.107478 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.109096 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.109783 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.110579 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pbqwc" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.179042 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.246050 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.246305 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/510c5136-4ca0-49c9-ba30-1cafb624d71f-cache\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.246426 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.246606 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2dml\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-kube-api-access-b2dml\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.246711 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/510c5136-4ca0-49c9-ba30-1cafb624d71f-lock\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc 
kubenswrapper[4757]: I1216 13:05:34.347745 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2dml\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-kube-api-access-b2dml\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.347801 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/510c5136-4ca0-49c9-ba30-1cafb624d71f-lock\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.347837 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.347852 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/510c5136-4ca0-49c9-ba30-1cafb624d71f-cache\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.347898 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: E1216 13:05:34.348068 4757 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 13:05:34 crc kubenswrapper[4757]: E1216 13:05:34.348080 4757 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 13:05:34 crc kubenswrapper[4757]: E1216 13:05:34.348121 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift podName:510c5136-4ca0-49c9-ba30-1cafb624d71f nodeName:}" failed. No retries permitted until 2025-12-16 13:05:34.848106967 +0000 UTC m=+1120.275850763 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift") pod "swift-storage-0" (UID: "510c5136-4ca0-49c9-ba30-1cafb624d71f") : configmap "swift-ring-files" not found Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.348311 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.348313 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/510c5136-4ca0-49c9-ba30-1cafb624d71f-lock\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.348625 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/510c5136-4ca0-49c9-ba30-1cafb624d71f-cache\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.372043 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.387088 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2dml\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-kube-api-access-b2dml\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.853870 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"16cd2aac-d1cc-4f30-8b86-8fd811f20f88","Type":"ContainerStarted","Data":"4e2e0d861fee47abdbe3c7d8a06a69d6fa1106a90095b53b8bb14879a08e26b3"} Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.854348 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"16cd2aac-d1cc-4f30-8b86-8fd811f20f88","Type":"ContainerStarted","Data":"7fac8efa87de4c3951a3a633a16629db2bcb99dc3ce93eae003ad1c66d7076ad"} Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.854527 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.855483 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:34 crc kubenswrapper[4757]: E1216 13:05:34.855631 4757 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 13:05:34 crc kubenswrapper[4757]: E1216 13:05:34.855651 4757 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 13:05:34 crc kubenswrapper[4757]: E1216 
13:05:34.855700 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift podName:510c5136-4ca0-49c9-ba30-1cafb624d71f nodeName:}" failed. No retries permitted until 2025-12-16 13:05:35.855679521 +0000 UTC m=+1121.283423317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift") pod "swift-storage-0" (UID: "510c5136-4ca0-49c9-ba30-1cafb624d71f") : configmap "swift-ring-files" not found Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.859386 4757 generic.go:334] "Generic (PLEG): container finished" podID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerID="e39d0d04f18a1fc35ca9ec758cd3d29ce079ff14009ffcfa422a3406edff54e5" exitCode=0 Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.859561 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerName="dnsmasq-dns" containerID="cri-o://569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3" gracePeriod=10 Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.860311 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" event={"ID":"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240","Type":"ContainerDied","Data":"e39d0d04f18a1fc35ca9ec758cd3d29ce079ff14009ffcfa422a3406edff54e5"} Dec 16 13:05:34 crc kubenswrapper[4757]: I1216 13:05:34.872630 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.760668626 podStartE2EDuration="4.87261357s" podCreationTimestamp="2025-12-16 13:05:30 +0000 UTC" firstStartedPulling="2025-12-16 13:05:31.077677138 +0000 UTC m=+1116.505420934" lastFinishedPulling="2025-12-16 13:05:34.189622082 +0000 UTC m=+1119.617365878" observedRunningTime="2025-12-16 13:05:34.871980924 +0000 UTC m=+1120.299724720" watchObservedRunningTime="2025-12-16 13:05:34.87261357 +0000 UTC m=+1120.300357366" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.819989 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.869337 4757 generic.go:334] "Generic (PLEG): container finished" podID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerID="569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3" exitCode=0 Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.869439 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.870510 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" event={"ID":"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9","Type":"ContainerDied","Data":"569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3"} Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.870655 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-mnd57" event={"ID":"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9","Type":"ContainerDied","Data":"0768cfd1f93bd3748d592ac222a105609742b6ec922693f6d8ed055d324a52e3"} Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.870764 4757 scope.go:117] "RemoveContainer" containerID="569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.870989 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:35 crc kubenswrapper[4757]: E1216 13:05:35.871279 4757 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 13:05:35 crc kubenswrapper[4757]: E1216 13:05:35.871302 4757 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 13:05:35 crc kubenswrapper[4757]: E1216 13:05:35.871346 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift podName:510c5136-4ca0-49c9-ba30-1cafb624d71f nodeName:}" failed. No retries permitted until 2025-12-16 13:05:37.871329697 +0000 UTC m=+1123.299073493 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift") pod "swift-storage-0" (UID: "510c5136-4ca0-49c9-ba30-1cafb624d71f") : configmap "swift-ring-files" not found Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.880361 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" event={"ID":"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240","Type":"ContainerStarted","Data":"77b29cac38112414de492bbc46e4122b6f17403cad344525a6855eaeec15e3c9"} Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.880439 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.911552 4757 scope.go:117] "RemoveContainer" containerID="7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.932711 4757 scope.go:117] "RemoveContainer" containerID="569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3" Dec 16 13:05:35 crc kubenswrapper[4757]: E1216 13:05:35.933322 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3\": container with ID starting with 569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3 not found: ID does not exist" containerID="569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.933379 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3"} err="failed to get container status \"569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3\": rpc error: code = NotFound desc = could not find container \"569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3\": container with ID starting with 569b03cdcec453a1f62d3c5454b46d6fdeb0c8e8e959d728a733d2e2ca5188c3 not found: ID does not exist" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.933415 4757 scope.go:117] "RemoveContainer" containerID="7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.935904 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" podStartSLOduration=3.9358876289999998 podStartE2EDuration="3.935887629s" podCreationTimestamp="2025-12-16 13:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:35.913330997 +0000 UTC m=+1121.341074793" watchObservedRunningTime="2025-12-16 13:05:35.935887629 +0000 UTC m=+1121.363631425" Dec 16 13:05:35 crc kubenswrapper[4757]: E1216 13:05:35.937080 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4\": container with ID starting with 7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4 not found: ID does not exist" containerID="7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.937108 4757 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4"} err="failed to get container status \"7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4\": rpc error: code = NotFound desc = could not find container \"7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4\": container with ID starting with 7e47ea6ecc24456beb9d32f970a01c607f9a45e3007a6e52622ab9d4a52d07c4 not found: ID does not exist" Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.971620 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-config\") pod \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.971899 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qxt\" (UniqueName: \"kubernetes.io/projected/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-kube-api-access-z9qxt\") pod \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.972137 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-dns-svc\") pod \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.972296 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-ovsdbserver-sb\") pod \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\" (UID: \"84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9\") " Dec 16 13:05:35 crc kubenswrapper[4757]: I1216 13:05:35.986538 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-kube-api-access-z9qxt" (OuterVolumeSpecName: "kube-api-access-z9qxt") pod "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" (UID: "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9"). InnerVolumeSpecName "kube-api-access-z9qxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.028396 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" (UID: "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.036659 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-config" (OuterVolumeSpecName: "config") pod "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" (UID: "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.043552 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" (UID: "84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.075204 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.075662 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.075689 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.075708 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9qxt\" (UniqueName: \"kubernetes.io/projected/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9-kube-api-access-z9qxt\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.205061 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-mnd57"] Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.214232 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-mnd57"] Dec 16 13:05:36 crc kubenswrapper[4757]: I1216 13:05:36.959282 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" path="/var/lib/kubelet/pods/84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9/volumes" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.176501 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.249845 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.921429 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j69vw"] Dec 16 13:05:37 crc kubenswrapper[4757]: E1216 13:05:37.922204 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerName="dnsmasq-dns" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.922224 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerName="dnsmasq-dns" Dec 16 13:05:37 crc kubenswrapper[4757]: E1216 13:05:37.922264 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerName="init" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.922274 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerName="init" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.922467 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="84868aff-cc4d-44e1-b9ba-c0fbb58b3aa9" containerName="dnsmasq-dns" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.923090 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.924349 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:37 crc kubenswrapper[4757]: E1216 13:05:37.924636 4757 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 13:05:37 crc kubenswrapper[4757]: E1216 13:05:37.924659 4757 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 13:05:37 crc kubenswrapper[4757]: E1216 13:05:37.924709 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift podName:510c5136-4ca0-49c9-ba30-1cafb624d71f nodeName:}" failed. No retries permitted until 2025-12-16 13:05:41.924691942 +0000 UTC m=+1127.352435758 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift") pod "swift-storage-0" (UID: "510c5136-4ca0-49c9-ba30-1cafb624d71f") : configmap "swift-ring-files" not found Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.931319 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.931595 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.931653 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 13:05:37 crc kubenswrapper[4757]: I1216 13:05:37.936876 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j69vw"] Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.025633 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-swiftconf\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.025706 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-dispersionconf\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.025759 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-combined-ca-bundle\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.026046 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-scripts\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.026154 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-ring-data-devices\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.026221 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff595563-ea6e-4337-8018-275c60afebfb-etc-swift\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.026309 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7tk7\" (UniqueName: \"kubernetes.io/projected/ff595563-ea6e-4337-8018-275c60afebfb-kube-api-access-w7tk7\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.127732 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff595563-ea6e-4337-8018-275c60afebfb-etc-swift\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.127857 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7tk7\" (UniqueName: \"kubernetes.io/projected/ff595563-ea6e-4337-8018-275c60afebfb-kube-api-access-w7tk7\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.127926 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-swiftconf\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.127969 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-dispersionconf\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.128036 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-combined-ca-bundle\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.128092 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-scripts\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.128124 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-ring-data-devices\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.128222 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff595563-ea6e-4337-8018-275c60afebfb-etc-swift\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.128854 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-ring-data-devices\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.129043 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-scripts\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.131914 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-swiftconf\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.132153 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-dispersionconf\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.132276 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-combined-ca-bundle\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.152968 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7tk7\" (UniqueName: \"kubernetes.io/projected/ff595563-ea6e-4337-8018-275c60afebfb-kube-api-access-w7tk7\") pod \"swift-ring-rebalance-j69vw\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.246624 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.686144 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j69vw"] Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.912558 4757 generic.go:334] "Generic (PLEG): container finished" podID="38824624-9325-4515-ab97-157001f60385" containerID="0e85736b68697cab324c82628435878c14a0dc9fd56231019ad728633de68d34" exitCode=0 Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.912639 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38824624-9325-4515-ab97-157001f60385","Type":"ContainerDied","Data":"0e85736b68697cab324c82628435878c14a0dc9fd56231019ad728633de68d34"} Dec 16 13:05:38 crc kubenswrapper[4757]: I1216 13:05:38.913726 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j69vw" event={"ID":"ff595563-ea6e-4337-8018-275c60afebfb","Type":"ContainerStarted","Data":"1def7b327a3294f04906ff44d1f492b4c5088b52b2737d14c85c9d8b55743155"} Dec 16 13:05:39 crc kubenswrapper[4757]: I1216 13:05:39.249283 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 13:05:39 crc kubenswrapper[4757]: I1216 13:05:39.342198 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 13:05:39 crc kubenswrapper[4757]: I1216 13:05:39.925386 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38824624-9325-4515-ab97-157001f60385","Type":"ContainerStarted","Data":"0aa29fce5aa134e8e87689184651a5c33124e6e03eb9b1f6a548b2059a8fdd88"} Dec 16 13:05:39 crc kubenswrapper[4757]: I1216 13:05:39.925889 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:05:39 crc kubenswrapper[4757]: I1216 13:05:39.953187 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.630804048 podStartE2EDuration="1m13.953151859s" podCreationTimestamp="2025-12-16 13:04:26 +0000 UTC" firstStartedPulling="2025-12-16 13:04:28.504866296 +0000 UTC m=+1053.932610092" lastFinishedPulling="2025-12-16 13:05:04.827214107 +0000 UTC m=+1090.254957903" observedRunningTime="2025-12-16 13:05:39.949490633 +0000 UTC m=+1125.377234429" watchObservedRunningTime="2025-12-16 13:05:39.953151859 +0000 UTC m=+1125.380895665" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.031447 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.320499 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gwvrz"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.322067 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.347368 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gwvrz"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.366852 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3061dea3-7e30-407e-a77b-b696ae6710a1-operator-scripts\") pod \"keystone-db-create-gwvrz\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.366953 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v2t4\" (UniqueName: \"kubernetes.io/projected/3061dea3-7e30-407e-a77b-b696ae6710a1-kube-api-access-5v2t4\") pod \"keystone-db-create-gwvrz\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.375973 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-16e6-account-create-update-5mfht"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.381594 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.390110 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.400254 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-16e6-account-create-update-5mfht"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.469419 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-operator-scripts\") pod \"keystone-16e6-account-create-update-5mfht\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.469510 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v2t4\" (UniqueName: \"kubernetes.io/projected/3061dea3-7e30-407e-a77b-b696ae6710a1-kube-api-access-5v2t4\") pod \"keystone-db-create-gwvrz\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.469676 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jlk\" (UniqueName: \"kubernetes.io/projected/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-kube-api-access-c7jlk\") pod \"keystone-16e6-account-create-update-5mfht\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.469943 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3061dea3-7e30-407e-a77b-b696ae6710a1-operator-scripts\") pod \"keystone-db-create-gwvrz\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.471125 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3061dea3-7e30-407e-a77b-b696ae6710a1-operator-scripts\") pod \"keystone-db-create-gwvrz\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.494712 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v2t4\" (UniqueName: \"kubernetes.io/projected/3061dea3-7e30-407e-a77b-b696ae6710a1-kube-api-access-5v2t4\") pod \"keystone-db-create-gwvrz\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.567899 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-s6pnm"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.569167 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.571140 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-operator-scripts\") pod \"keystone-16e6-account-create-update-5mfht\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.571314 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jlk\" (UniqueName: \"kubernetes.io/projected/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-kube-api-access-c7jlk\") pod \"keystone-16e6-account-create-update-5mfht\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.572353 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-operator-scripts\") pod \"keystone-16e6-account-create-update-5mfht\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.580350 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s6pnm"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.619711 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jlk\" (UniqueName: \"kubernetes.io/projected/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-kube-api-access-c7jlk\") pod \"keystone-16e6-account-create-update-5mfht\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.641077 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.656911 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5cbb-account-create-update-d89ll"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.657813 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.660482 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.671423 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cbb-account-create-update-d89ll"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.672194 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b512b5d-218c-4612-8030-536b8b21ab7d-operator-scripts\") pod \"placement-db-create-s6pnm\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.674248 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb682\" (UniqueName: \"kubernetes.io/projected/0b512b5d-218c-4612-8030-536b8b21ab7d-kube-api-access-kb682\") pod \"placement-db-create-s6pnm\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.705104 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.776381 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3800f971-f8d9-46cd-9b10-36095059a766-operator-scripts\") pod \"placement-5cbb-account-create-update-d89ll\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.777180 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb682\" (UniqueName: \"kubernetes.io/projected/0b512b5d-218c-4612-8030-536b8b21ab7d-kube-api-access-kb682\") pod \"placement-db-create-s6pnm\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.777315 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxktt\" (UniqueName: \"kubernetes.io/projected/3800f971-f8d9-46cd-9b10-36095059a766-kube-api-access-bxktt\") pod \"placement-5cbb-account-create-update-d89ll\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.777441 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b512b5d-218c-4612-8030-536b8b21ab7d-operator-scripts\") pod \"placement-db-create-s6pnm\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.778210 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b512b5d-218c-4612-8030-536b8b21ab7d-operator-scripts\") pod \"placement-db-create-s6pnm\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 
13:05:40.782182 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8rhc9"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.783493 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.789883 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8rhc9"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.801901 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb682\" (UniqueName: \"kubernetes.io/projected/0b512b5d-218c-4612-8030-536b8b21ab7d-kube-api-access-kb682\") pod \"placement-db-create-s6pnm\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.879242 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-operator-scripts\") pod \"glance-db-create-8rhc9\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.879308 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5wc\" (UniqueName: \"kubernetes.io/projected/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-kube-api-access-ql5wc\") pod \"glance-db-create-8rhc9\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.879415 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3800f971-f8d9-46cd-9b10-36095059a766-operator-scripts\") pod \"placement-5cbb-account-create-update-d89ll\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.879494 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxktt\" (UniqueName: \"kubernetes.io/projected/3800f971-f8d9-46cd-9b10-36095059a766-kube-api-access-bxktt\") pod \"placement-5cbb-account-create-update-d89ll\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.880784 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3800f971-f8d9-46cd-9b10-36095059a766-operator-scripts\") pod \"placement-5cbb-account-create-update-d89ll\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.891351 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.928489 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxktt\" (UniqueName: \"kubernetes.io/projected/3800f971-f8d9-46cd-9b10-36095059a766-kube-api-access-bxktt\") pod \"placement-5cbb-account-create-update-d89ll\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.974220 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.982882 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-operator-scripts\") pod \"glance-db-create-8rhc9\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.982948 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5wc\" (UniqueName: \"kubernetes.io/projected/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-kube-api-access-ql5wc\") pod \"glance-db-create-8rhc9\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.984175 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-operator-scripts\") pod \"glance-db-create-8rhc9\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.995200 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e53a-account-create-update-9mz79"] Dec 16 13:05:40 crc kubenswrapper[4757]: I1216 13:05:40.996377 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.000714 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e53a-account-create-update-9mz79"] Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.005921 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.023219 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5wc\" (UniqueName: \"kubernetes.io/projected/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-kube-api-access-ql5wc\") pod \"glance-db-create-8rhc9\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.084134 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s866\" (UniqueName: \"kubernetes.io/projected/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-kube-api-access-6s866\") pod \"glance-e53a-account-create-update-9mz79\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.084258 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-operator-scripts\") pod \"glance-e53a-account-create-update-9mz79\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.117292 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.186153 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s866\" (UniqueName: \"kubernetes.io/projected/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-kube-api-access-6s866\") pod \"glance-e53a-account-create-update-9mz79\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.186223 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-operator-scripts\") pod \"glance-e53a-account-create-update-9mz79\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.186950 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-operator-scripts\") pod \"glance-e53a-account-create-update-9mz79\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.200805 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s866\" (UniqueName: \"kubernetes.io/projected/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-kube-api-access-6s866\") pod \"glance-e53a-account-create-update-9mz79\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:41 crc kubenswrapper[4757]: I1216 13:05:41.315942 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:42 crc kubenswrapper[4757]: I1216 13:05:42.005230 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:42 crc kubenswrapper[4757]: E1216 13:05:42.005352 4757 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 13:05:42 crc kubenswrapper[4757]: E1216 13:05:42.006427 4757 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 13:05:42 crc kubenswrapper[4757]: E1216 13:05:42.006475 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift podName:510c5136-4ca0-49c9-ba30-1cafb624d71f nodeName:}" failed. No retries permitted until 2025-12-16 13:05:50.006461103 +0000 UTC m=+1135.434204899 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift") pod "swift-storage-0" (UID: "510c5136-4ca0-49c9-ba30-1cafb624d71f") : configmap "swift-ring-files" not found Dec 16 13:05:43 crc kubenswrapper[4757]: I1216 13:05:43.265050 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:05:43 crc kubenswrapper[4757]: I1216 13:05:43.324166 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7dxbf"] Dec 16 13:05:43 crc kubenswrapper[4757]: I1216 13:05:43.324468 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-7dxbf" podUID="ba36bade-6096-4993-96e6-9883fbc19640" containerName="dnsmasq-dns" containerID="cri-o://6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8" gracePeriod=10 Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.875594 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.990244 4757 generic.go:334] "Generic (PLEG): container finished" podID="ba36bade-6096-4993-96e6-9883fbc19640" containerID="6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8" exitCode=0 Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.990372 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7dxbf" event={"ID":"ba36bade-6096-4993-96e6-9883fbc19640","Type":"ContainerDied","Data":"6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8"} Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.990805 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-7dxbf" event={"ID":"ba36bade-6096-4993-96e6-9883fbc19640","Type":"ContainerDied","Data":"b11ab42b5a243ce737767eaefe8400195b0f09bfaed28f3f6567fbf53aefd22b"} Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.991267 4757 scope.go:117] "RemoveContainer" containerID="6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8" Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.990463 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-7dxbf" Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.996713 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-sb\") pod \"ba36bade-6096-4993-96e6-9883fbc19640\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.996795 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-config\") pod \"ba36bade-6096-4993-96e6-9883fbc19640\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.996894 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j4kb\" (UniqueName: \"kubernetes.io/projected/ba36bade-6096-4993-96e6-9883fbc19640-kube-api-access-6j4kb\") pod \"ba36bade-6096-4993-96e6-9883fbc19640\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.996967 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-nb\") pod \"ba36bade-6096-4993-96e6-9883fbc19640\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " Dec 16 13:05:44 crc kubenswrapper[4757]: I1216 13:05:44.997036 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-dns-svc\") pod \"ba36bade-6096-4993-96e6-9883fbc19640\" (UID: \"ba36bade-6096-4993-96e6-9883fbc19640\") " Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.004965 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba36bade-6096-4993-96e6-9883fbc19640-kube-api-access-6j4kb" (OuterVolumeSpecName: "kube-api-access-6j4kb") pod "ba36bade-6096-4993-96e6-9883fbc19640" (UID: "ba36bade-6096-4993-96e6-9883fbc19640"). InnerVolumeSpecName "kube-api-access-6j4kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.064815 4757 scope.go:117] "RemoveContainer" containerID="a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.066264 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8rhc9"] Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.081696 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba36bade-6096-4993-96e6-9883fbc19640" (UID: "ba36bade-6096-4993-96e6-9883fbc19640"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.118379 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.118918 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j4kb\" (UniqueName: \"kubernetes.io/projected/ba36bade-6096-4993-96e6-9883fbc19640-kube-api-access-6j4kb\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.139843 4757 scope.go:117] "RemoveContainer" containerID="6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8" Dec 16 13:05:45 crc kubenswrapper[4757]: E1216 13:05:45.144545 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8\": container with ID starting with 6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8 not found: ID does not exist" containerID="6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.144590 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8"} err="failed to get container status \"6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8\": rpc error: code = NotFound desc = could not find container \"6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8\": container with ID starting with 6fa61de072bdafa962eecc396172d2da62e630892bf4ed008715257807469ab8 not found: ID does not exist" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.144634 4757 scope.go:117] "RemoveContainer" containerID="a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87" Dec 16 13:05:45 crc kubenswrapper[4757]: E1216 13:05:45.145018 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87\": container with ID starting with a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87 not found: ID does not exist" containerID="a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.145064 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87"} err="failed to get container status \"a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87\": rpc error: code = NotFound desc = could not find container \"a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87\": container with ID starting with a000461a06a73f53f3d2b70efed9742d8fc75cf060ca383d4433527d076b3c87 not found: ID does not exist" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.201822 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba36bade-6096-4993-96e6-9883fbc19640" (UID: "ba36bade-6096-4993-96e6-9883fbc19640"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.223292 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.226806 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba36bade-6096-4993-96e6-9883fbc19640" (UID: "ba36bade-6096-4993-96e6-9883fbc19640"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.263274 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-config" (OuterVolumeSpecName: "config") pod "ba36bade-6096-4993-96e6-9883fbc19640" (UID: "ba36bade-6096-4993-96e6-9883fbc19640"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.290663 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-16e6-account-create-update-5mfht"] Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.330047 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.330329 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba36bade-6096-4993-96e6-9883fbc19640-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.415783 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gwvrz"] Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.435322 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7dxbf"] Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.439456 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-7dxbf"] Dec 16 13:05:45 crc kubenswrapper[4757]: W1216 13:05:45.449956 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3061dea3_7e30_407e_a77b_b696ae6710a1.slice/crio-bb4a7783dd0d8c54e402871b11145fa598a70832eeb14df47199938ef2393ff3 WatchSource:0}: Error finding container bb4a7783dd0d8c54e402871b11145fa598a70832eeb14df47199938ef2393ff3: Status 404 returned error can't find the container with id bb4a7783dd0d8c54e402871b11145fa598a70832eeb14df47199938ef2393ff3 Dec 16 13:05:45 crc kubenswrapper[4757]: W1216 13:05:45.561905 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3800f971_f8d9_46cd_9b10_36095059a766.slice/crio-5ea815fbf86cf5cf03303fd7fc62f166a67ba09d940f660d75d0382ddd0e86b5 WatchSource:0}: Error finding container 5ea815fbf86cf5cf03303fd7fc62f166a67ba09d940f660d75d0382ddd0e86b5: Status 404 returned error can't find the container with id 5ea815fbf86cf5cf03303fd7fc62f166a67ba09d940f660d75d0382ddd0e86b5 Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.572497 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-5cbb-account-create-update-d89ll"] Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.587554 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.599417 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e53a-account-create-update-9mz79"] Dec 16 13:05:45 crc kubenswrapper[4757]: W1216 13:05:45.612253 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a2f7431_0cc4_4be1_ba70_36ee0333bc0d.slice/crio-16beb2ac1ae92f7882a3b21d3186460bdd0650df17271d869c12a0236350057b WatchSource:0}: Error finding container 16beb2ac1ae92f7882a3b21d3186460bdd0650df17271d869c12a0236350057b: Status 404 returned error can't find the container with id 16beb2ac1ae92f7882a3b21d3186460bdd0650df17271d869c12a0236350057b Dec 16 13:05:45 crc kubenswrapper[4757]: I1216 13:05:45.747996 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s6pnm"] Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.004656 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb-account-create-update-d89ll" event={"ID":"3800f971-f8d9-46cd-9b10-36095059a766","Type":"ContainerStarted","Data":"f1521b3d76b4ef6cc4257efa3cbe8cae7495e08dc0aae39be9035d94770a0208"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.005327 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb-account-create-update-d89ll" event={"ID":"3800f971-f8d9-46cd-9b10-36095059a766","Type":"ContainerStarted","Data":"5ea815fbf86cf5cf03303fd7fc62f166a67ba09d940f660d75d0382ddd0e86b5"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.007692 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gwvrz" event={"ID":"3061dea3-7e30-407e-a77b-b696ae6710a1","Type":"ContainerStarted","Data":"ac636fd6af82484a667c4ce5ebb86d03b641b98c47f5e8c9d104671049c90c15"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.007744 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gwvrz" event={"ID":"3061dea3-7e30-407e-a77b-b696ae6710a1","Type":"ContainerStarted","Data":"bb4a7783dd0d8c54e402871b11145fa598a70832eeb14df47199938ef2393ff3"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.009512 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s6pnm" event={"ID":"0b512b5d-218c-4612-8030-536b8b21ab7d","Type":"ContainerStarted","Data":"e94adb064bd9a63fe4055152220d672f78dd32e44d5e599c5539e0c7f0a02334"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.011582 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j69vw" event={"ID":"ff595563-ea6e-4337-8018-275c60afebfb","Type":"ContainerStarted","Data":"6bc56fe911c8cc2e0897d8072c2c3720a9aa3401ee2472199265b735bd3ecf3a"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.013369 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16e6-account-create-update-5mfht" event={"ID":"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42","Type":"ContainerStarted","Data":"c9672ed0fc0283c34eb8f447496c110a39f112477587937bcee8fff0821188b6"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.013393 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16e6-account-create-update-5mfht" 
event={"ID":"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42","Type":"ContainerStarted","Data":"cdf6199086997fba5c423842bd4d445300367310d02d9d79eb8377c40a181ee3"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.016303 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e53a-account-create-update-9mz79" event={"ID":"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d","Type":"ContainerStarted","Data":"7fd4c00449672da9dd83012aef5cd0874985ae6bcb5c984d7c347396768c76eb"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.016330 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e53a-account-create-update-9mz79" event={"ID":"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d","Type":"ContainerStarted","Data":"16beb2ac1ae92f7882a3b21d3186460bdd0650df17271d869c12a0236350057b"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.022298 4757 generic.go:334] "Generic (PLEG): container finished" podID="5fbc2012-fd22-4292-91b2-423e7ca2f2b6" containerID="b366a7ada5cd808a8900f31f40d9df9d14af61a6afc7b1fd95ff8267be6c1587" exitCode=0 Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.022425 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8rhc9" event={"ID":"5fbc2012-fd22-4292-91b2-423e7ca2f2b6","Type":"ContainerDied","Data":"b366a7ada5cd808a8900f31f40d9df9d14af61a6afc7b1fd95ff8267be6c1587"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.022681 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8rhc9" event={"ID":"5fbc2012-fd22-4292-91b2-423e7ca2f2b6","Type":"ContainerStarted","Data":"42dc48a59d6bc23c5bde78d1975793463d78a4e818292b559944c1fcbfe6918d"} Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.031152 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5cbb-account-create-update-d89ll" podStartSLOduration=6.031128487 podStartE2EDuration="6.031128487s" podCreationTimestamp="2025-12-16 13:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:46.0282683 +0000 UTC m=+1131.456012096" watchObservedRunningTime="2025-12-16 13:05:46.031128487 +0000 UTC m=+1131.458872283" Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.061393 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-e53a-account-create-update-9mz79" podStartSLOduration=6.06137106 podStartE2EDuration="6.06137106s" podCreationTimestamp="2025-12-16 13:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:46.059727732 +0000 UTC m=+1131.487471528" watchObservedRunningTime="2025-12-16 13:05:46.06137106 +0000 UTC m=+1131.489114856" Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.092375 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-16e6-account-create-update-5mfht" podStartSLOduration=6.09235239 podStartE2EDuration="6.09235239s" podCreationTimestamp="2025-12-16 13:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:46.089671847 +0000 UTC m=+1131.517415653" watchObservedRunningTime="2025-12-16 13:05:46.09235239 +0000 UTC m=+1131.520096186" Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.171890 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-create-gwvrz" podStartSLOduration=6.171864904 podStartE2EDuration="6.171864904s" podCreationTimestamp="2025-12-16 13:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:46.145752599 +0000 UTC m=+1131.573496395" watchObservedRunningTime="2025-12-16 13:05:46.171864904 +0000 UTC m=+1131.599608700" Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.173959 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-s6pnm" podStartSLOduration=6.173943533 podStartE2EDuration="6.173943533s" podCreationTimestamp="2025-12-16 13:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:46.16743274 +0000 UTC m=+1131.595176536" watchObservedRunningTime="2025-12-16 13:05:46.173943533 +0000 UTC m=+1131.601687329" Dec 16 13:05:46 crc kubenswrapper[4757]: I1216 13:05:46.192879 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j69vw" podStartSLOduration=3.068348034 podStartE2EDuration="9.192859979s" podCreationTimestamp="2025-12-16 13:05:37 +0000 UTC" firstStartedPulling="2025-12-16 13:05:38.702418351 +0000 UTC m=+1124.130162147" lastFinishedPulling="2025-12-16 13:05:44.826930276 +0000 UTC m=+1130.254674092" observedRunningTime="2025-12-16 13:05:46.188728712 +0000 UTC m=+1131.616472518" watchObservedRunningTime="2025-12-16 13:05:46.192859979 +0000 UTC m=+1131.620603765" Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.219298 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba36bade-6096-4993-96e6-9883fbc19640" path="/var/lib/kubelet/pods/ba36bade-6096-4993-96e6-9883fbc19640/volumes" Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.260314 4757 generic.go:334] "Generic (PLEG): container finished" podID="3061dea3-7e30-407e-a77b-b696ae6710a1" containerID="ac636fd6af82484a667c4ce5ebb86d03b641b98c47f5e8c9d104671049c90c15" exitCode=0 Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.260679 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gwvrz" event={"ID":"3061dea3-7e30-407e-a77b-b696ae6710a1","Type":"ContainerDied","Data":"ac636fd6af82484a667c4ce5ebb86d03b641b98c47f5e8c9d104671049c90c15"} Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.282326 4757 generic.go:334] "Generic (PLEG): container finished" podID="0b512b5d-218c-4612-8030-536b8b21ab7d" containerID="f4d4a6e607675bd63518611423ac412afad093a7c3b4236d3a90d1089e4973c6" exitCode=0 Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.282452 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s6pnm" event={"ID":"0b512b5d-218c-4612-8030-536b8b21ab7d","Type":"ContainerDied","Data":"f4d4a6e607675bd63518611423ac412afad093a7c3b4236d3a90d1089e4973c6"} Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.293415 4757 generic.go:334] "Generic (PLEG): container finished" podID="63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" containerID="c9672ed0fc0283c34eb8f447496c110a39f112477587937bcee8fff0821188b6" exitCode=0 Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.294174 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16e6-account-create-update-5mfht" 
event={"ID":"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42","Type":"ContainerDied","Data":"c9672ed0fc0283c34eb8f447496c110a39f112477587937bcee8fff0821188b6"} Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.306111 4757 generic.go:334] "Generic (PLEG): container finished" podID="1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" containerID="7fd4c00449672da9dd83012aef5cd0874985ae6bcb5c984d7c347396768c76eb" exitCode=0 Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.306234 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e53a-account-create-update-9mz79" event={"ID":"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d","Type":"ContainerDied","Data":"7fd4c00449672da9dd83012aef5cd0874985ae6bcb5c984d7c347396768c76eb"} Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.319635 4757 generic.go:334] "Generic (PLEG): container finished" podID="3800f971-f8d9-46cd-9b10-36095059a766" containerID="f1521b3d76b4ef6cc4257efa3cbe8cae7495e08dc0aae39be9035d94770a0208" exitCode=0 Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.320426 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb-account-create-update-d89ll" event={"ID":"3800f971-f8d9-46cd-9b10-36095059a766","Type":"ContainerDied","Data":"f1521b3d76b4ef6cc4257efa3cbe8cae7495e08dc0aae39be9035d94770a0208"} Dec 16 13:05:47 crc kubenswrapper[4757]: I1216 13:05:47.848993 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.007903 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-operator-scripts\") pod \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.008098 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5wc\" (UniqueName: \"kubernetes.io/projected/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-kube-api-access-ql5wc\") pod \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\" (UID: \"5fbc2012-fd22-4292-91b2-423e7ca2f2b6\") " Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.008641 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fbc2012-fd22-4292-91b2-423e7ca2f2b6" (UID: "5fbc2012-fd22-4292-91b2-423e7ca2f2b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.009109 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.014237 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-kube-api-access-ql5wc" (OuterVolumeSpecName: "kube-api-access-ql5wc") pod "5fbc2012-fd22-4292-91b2-423e7ca2f2b6" (UID: "5fbc2012-fd22-4292-91b2-423e7ca2f2b6"). InnerVolumeSpecName "kube-api-access-ql5wc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.110920 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5wc\" (UniqueName: \"kubernetes.io/projected/5fbc2012-fd22-4292-91b2-423e7ca2f2b6-kube-api-access-ql5wc\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.328077 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8rhc9" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.328123 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8rhc9" event={"ID":"5fbc2012-fd22-4292-91b2-423e7ca2f2b6","Type":"ContainerDied","Data":"42dc48a59d6bc23c5bde78d1975793463d78a4e818292b559944c1fcbfe6918d"} Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.328168 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42dc48a59d6bc23c5bde78d1975793463d78a4e818292b559944c1fcbfe6918d" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.655019 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.830616 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxktt\" (UniqueName: \"kubernetes.io/projected/3800f971-f8d9-46cd-9b10-36095059a766-kube-api-access-bxktt\") pod \"3800f971-f8d9-46cd-9b10-36095059a766\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.830781 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3800f971-f8d9-46cd-9b10-36095059a766-operator-scripts\") pod \"3800f971-f8d9-46cd-9b10-36095059a766\" (UID: \"3800f971-f8d9-46cd-9b10-36095059a766\") " Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.831749 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3800f971-f8d9-46cd-9b10-36095059a766-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3800f971-f8d9-46cd-9b10-36095059a766" (UID: "3800f971-f8d9-46cd-9b10-36095059a766"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.842596 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3800f971-f8d9-46cd-9b10-36095059a766-kube-api-access-bxktt" (OuterVolumeSpecName: "kube-api-access-bxktt") pod "3800f971-f8d9-46cd-9b10-36095059a766" (UID: "3800f971-f8d9-46cd-9b10-36095059a766"). InnerVolumeSpecName "kube-api-access-bxktt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.933031 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3800f971-f8d9-46cd-9b10-36095059a766-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.933066 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxktt\" (UniqueName: \"kubernetes.io/projected/3800f971-f8d9-46cd-9b10-36095059a766-kube-api-access-bxktt\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:48 crc kubenswrapper[4757]: I1216 13:05:48.934382 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.034700 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v2t4\" (UniqueName: \"kubernetes.io/projected/3061dea3-7e30-407e-a77b-b696ae6710a1-kube-api-access-5v2t4\") pod \"3061dea3-7e30-407e-a77b-b696ae6710a1\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.034737 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3061dea3-7e30-407e-a77b-b696ae6710a1-operator-scripts\") pod \"3061dea3-7e30-407e-a77b-b696ae6710a1\" (UID: \"3061dea3-7e30-407e-a77b-b696ae6710a1\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.035402 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3061dea3-7e30-407e-a77b-b696ae6710a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3061dea3-7e30-407e-a77b-b696ae6710a1" (UID: "3061dea3-7e30-407e-a77b-b696ae6710a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.043603 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3061dea3-7e30-407e-a77b-b696ae6710a1-kube-api-access-5v2t4" (OuterVolumeSpecName: "kube-api-access-5v2t4") pod "3061dea3-7e30-407e-a77b-b696ae6710a1" (UID: "3061dea3-7e30-407e-a77b-b696ae6710a1"). InnerVolumeSpecName "kube-api-access-5v2t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.080385 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.084846 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.108860 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.137317 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v2t4\" (UniqueName: \"kubernetes.io/projected/3061dea3-7e30-407e-a77b-b696ae6710a1-kube-api-access-5v2t4\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.137357 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3061dea3-7e30-407e-a77b-b696ae6710a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238091 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b512b5d-218c-4612-8030-536b8b21ab7d-operator-scripts\") pod \"0b512b5d-218c-4612-8030-536b8b21ab7d\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238216 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb682\" (UniqueName: \"kubernetes.io/projected/0b512b5d-218c-4612-8030-536b8b21ab7d-kube-api-access-kb682\") pod \"0b512b5d-218c-4612-8030-536b8b21ab7d\" (UID: \"0b512b5d-218c-4612-8030-536b8b21ab7d\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238301 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s866\" (UniqueName: \"kubernetes.io/projected/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-kube-api-access-6s866\") pod \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238364 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-operator-scripts\") pod \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\" (UID: \"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238397 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-operator-scripts\") pod \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238481 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7jlk\" (UniqueName: \"kubernetes.io/projected/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-kube-api-access-c7jlk\") pod \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\" (UID: \"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42\") " Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238747 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b512b5d-218c-4612-8030-536b8b21ab7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b512b5d-218c-4612-8030-536b8b21ab7d" (UID: "0b512b5d-218c-4612-8030-536b8b21ab7d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.238928 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b512b5d-218c-4612-8030-536b8b21ab7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.239475 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" (UID: "1a2f7431-0cc4-4be1-ba70-36ee0333bc0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.239878 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" (UID: "63cbd0d4-eb87-4ee1-a6b1-cfe327223d42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.242159 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-kube-api-access-6s866" (OuterVolumeSpecName: "kube-api-access-6s866") pod "1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" (UID: "1a2f7431-0cc4-4be1-ba70-36ee0333bc0d"). InnerVolumeSpecName "kube-api-access-6s866". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.242323 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-kube-api-access-c7jlk" (OuterVolumeSpecName: "kube-api-access-c7jlk") pod "63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" (UID: "63cbd0d4-eb87-4ee1-a6b1-cfe327223d42"). InnerVolumeSpecName "kube-api-access-c7jlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.242691 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b512b5d-218c-4612-8030-536b8b21ab7d-kube-api-access-kb682" (OuterVolumeSpecName: "kube-api-access-kb682") pod "0b512b5d-218c-4612-8030-536b8b21ab7d" (UID: "0b512b5d-218c-4612-8030-536b8b21ab7d"). InnerVolumeSpecName "kube-api-access-kb682". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.336733 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s6pnm" event={"ID":"0b512b5d-218c-4612-8030-536b8b21ab7d","Type":"ContainerDied","Data":"e94adb064bd9a63fe4055152220d672f78dd32e44d5e599c5539e0c7f0a02334"} Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.336774 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e94adb064bd9a63fe4055152220d672f78dd32e44d5e599c5539e0c7f0a02334" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.336826 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s6pnm" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.348333 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-16e6-account-create-update-5mfht" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.348352 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-16e6-account-create-update-5mfht" event={"ID":"63cbd0d4-eb87-4ee1-a6b1-cfe327223d42","Type":"ContainerDied","Data":"cdf6199086997fba5c423842bd4d445300367310d02d9d79eb8377c40a181ee3"} Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.348649 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf6199086997fba5c423842bd4d445300367310d02d9d79eb8377c40a181ee3" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.348978 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s866\" (UniqueName: \"kubernetes.io/projected/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-kube-api-access-6s866\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.349026 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.349039 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.349051 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7jlk\" (UniqueName: \"kubernetes.io/projected/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42-kube-api-access-c7jlk\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.349061 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb682\" (UniqueName: \"kubernetes.io/projected/0b512b5d-218c-4612-8030-536b8b21ab7d-kube-api-access-kb682\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.351733 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e53a-account-create-update-9mz79" event={"ID":"1a2f7431-0cc4-4be1-ba70-36ee0333bc0d","Type":"ContainerDied","Data":"16beb2ac1ae92f7882a3b21d3186460bdd0650df17271d869c12a0236350057b"} Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.351974 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16beb2ac1ae92f7882a3b21d3186460bdd0650df17271d869c12a0236350057b" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.351950 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e53a-account-create-update-9mz79" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.359430 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5cbb-account-create-update-d89ll" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.359438 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb-account-create-update-d89ll" event={"ID":"3800f971-f8d9-46cd-9b10-36095059a766","Type":"ContainerDied","Data":"5ea815fbf86cf5cf03303fd7fc62f166a67ba09d940f660d75d0382ddd0e86b5"} Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.359476 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ea815fbf86cf5cf03303fd7fc62f166a67ba09d940f660d75d0382ddd0e86b5" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.361817 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gwvrz" event={"ID":"3061dea3-7e30-407e-a77b-b696ae6710a1","Type":"ContainerDied","Data":"bb4a7783dd0d8c54e402871b11145fa598a70832eeb14df47199938ef2393ff3"} Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.361844 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb4a7783dd0d8c54e402871b11145fa598a70832eeb14df47199938ef2393ff3" Dec 16 13:05:49 crc kubenswrapper[4757]: I1216 13:05:49.361898 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gwvrz" Dec 16 13:05:50 crc kubenswrapper[4757]: I1216 13:05:50.061894 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:05:50 crc kubenswrapper[4757]: E1216 13:05:50.062190 4757 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 13:05:50 crc kubenswrapper[4757]: E1216 13:05:50.062561 4757 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 13:05:50 crc kubenswrapper[4757]: E1216 13:05:50.062630 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift podName:510c5136-4ca0-49c9-ba30-1cafb624d71f nodeName:}" failed. No retries permitted until 2025-12-16 13:06:06.062605243 +0000 UTC m=+1151.490349039 (durationBeforeRetry 16s). 
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.094993 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-54js6"]
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096146 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba36bade-6096-4993-96e6-9883fbc19640" containerName="init"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096166 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba36bade-6096-4993-96e6-9883fbc19640" containerName="init"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096181 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b512b5d-218c-4612-8030-536b8b21ab7d" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096190 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b512b5d-218c-4612-8030-536b8b21ab7d" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096206 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba36bade-6096-4993-96e6-9883fbc19640" containerName="dnsmasq-dns"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096214 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba36bade-6096-4993-96e6-9883fbc19640" containerName="dnsmasq-dns"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096229 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096236 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096252 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbc2012-fd22-4292-91b2-423e7ca2f2b6" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096259 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbc2012-fd22-4292-91b2-423e7ca2f2b6" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096278 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096286 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096298 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3800f971-f8d9-46cd-9b10-36095059a766" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096306 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="3800f971-f8d9-46cd-9b10-36095059a766" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: E1216 13:05:51.096326 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3061dea3-7e30-407e-a77b-b696ae6710a1" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096335 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="3061dea3-7e30-407e-a77b-b696ae6710a1" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096547 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="3061dea3-7e30-407e-a77b-b696ae6710a1" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096564 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b512b5d-218c-4612-8030-536b8b21ab7d" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096577 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba36bade-6096-4993-96e6-9883fbc19640" containerName="dnsmasq-dns"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096592 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="3800f971-f8d9-46cd-9b10-36095059a766" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096606 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096620 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbc2012-fd22-4292-91b2-423e7ca2f2b6" containerName="mariadb-database-create"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.096632 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" containerName="mariadb-account-create-update"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.097222 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.103504 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.103668 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24n8w"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.106794 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-54js6"]
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.281866 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-db-sync-config-data\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.281915 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-config-data\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.281937 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzw4\" (UniqueName: \"kubernetes.io/projected/fe9cebdb-26c9-4618-9640-5e17d5976d12-kube-api-access-ftzw4\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.281968 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-combined-ca-bundle\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.385253 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-db-sync-config-data\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.385300 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-config-data\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.385319 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzw4\" (UniqueName: \"kubernetes.io/projected/fe9cebdb-26c9-4618-9640-5e17d5976d12-kube-api-access-ftzw4\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.385348 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-combined-ca-bundle\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.391387 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-db-sync-config-data\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.391398 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-config-data\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.396990 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-combined-ca-bundle\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.404298 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftzw4\" (UniqueName: \"kubernetes.io/projected/fe9cebdb-26c9-4618-9640-5e17d5976d12-kube-api-access-ftzw4\") pod \"glance-db-sync-54js6\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") " pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.417160 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-54js6"
Dec 16 13:05:51 crc kubenswrapper[4757]: I1216 13:05:51.961047 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-54js6"]
Dec 16 13:05:52 crc kubenswrapper[4757]: I1216 13:05:52.390712 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-54js6" event={"ID":"fe9cebdb-26c9-4618-9640-5e17d5976d12","Type":"ContainerStarted","Data":"d6ab5147ef6c1966a23300c72a184e52c4cf767e373738ba8d03d435856cb567"}
Dec 16 13:05:54 crc kubenswrapper[4757]: I1216 13:05:54.423618 4757 generic.go:334] "Generic (PLEG): container finished" podID="ff595563-ea6e-4337-8018-275c60afebfb" containerID="6bc56fe911c8cc2e0897d8072c2c3720a9aa3401ee2472199265b735bd3ecf3a" exitCode=0
Dec 16 13:05:54 crc kubenswrapper[4757]: I1216 13:05:54.423726 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j69vw" event={"ID":"ff595563-ea6e-4337-8018-275c60afebfb","Type":"ContainerDied","Data":"6bc56fe911c8cc2e0897d8072c2c3720a9aa3401ee2472199265b735bd3ecf3a"}
Dec 16 13:05:54 crc kubenswrapper[4757]: I1216 13:05:54.431363 4757 generic.go:334] "Generic (PLEG): container finished" podID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerID="8719fe889cba7bf8bb1f4102e84c7999b73788e4f5119eaa39f5b154a8014058" exitCode=0
Dec 16 13:05:54 crc kubenswrapper[4757]: I1216 13:05:54.431412 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d0dd86b6-b617-44fa-aabc-f073e1df12ca","Type":"ContainerDied","Data":"8719fe889cba7bf8bb1f4102e84c7999b73788e4f5119eaa39f5b154a8014058"}
Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.439596 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d0dd86b6-b617-44fa-aabc-f073e1df12ca","Type":"ContainerStarted","Data":"390e2ade2cdd0e741440d82f6842104f35b406a950aaada35ce1e48c36b7c0e7"}
Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.440878 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.768184 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j69vw"
Need to start a new one" pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.787524 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371946.06727 podStartE2EDuration="1m30.787504709s" podCreationTimestamp="2025-12-16 13:04:25 +0000 UTC" firstStartedPulling="2025-12-16 13:04:28.05784534 +0000 UTC m=+1053.485589136" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:05:55.477686528 +0000 UTC m=+1140.905430324" watchObservedRunningTime="2025-12-16 13:05:55.787504709 +0000 UTC m=+1141.215248505" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860290 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-dispersionconf\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860346 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff595563-ea6e-4337-8018-275c60afebfb-etc-swift\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860394 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-swiftconf\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860465 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-scripts\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860483 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-ring-data-devices\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860531 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-combined-ca-bundle\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.860579 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7tk7\" (UniqueName: \"kubernetes.io/projected/ff595563-ea6e-4337-8018-275c60afebfb-kube-api-access-w7tk7\") pod \"ff595563-ea6e-4337-8018-275c60afebfb\" (UID: \"ff595563-ea6e-4337-8018-275c60afebfb\") " Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.861173 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.861752 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff595563-ea6e-4337-8018-275c60afebfb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.869470 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff595563-ea6e-4337-8018-275c60afebfb-kube-api-access-w7tk7" (OuterVolumeSpecName: "kube-api-access-w7tk7") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "kube-api-access-w7tk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.871392 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.885131 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-scripts" (OuterVolumeSpecName: "scripts") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.893029 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.912708 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ff595563-ea6e-4337-8018-275c60afebfb" (UID: "ff595563-ea6e-4337-8018-275c60afebfb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963325 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963359 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7tk7\" (UniqueName: \"kubernetes.io/projected/ff595563-ea6e-4337-8018-275c60afebfb-kube-api-access-w7tk7\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963373 4757 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963387 4757 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff595563-ea6e-4337-8018-275c60afebfb-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963399 4757 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff595563-ea6e-4337-8018-275c60afebfb-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963410 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:55 crc kubenswrapper[4757]: I1216 13:05:55.963421 4757 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff595563-ea6e-4337-8018-275c60afebfb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.004366 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xjblp" podUID="824c8db6-764f-4062-85c5-3c0fcbe434ce" containerName="ovn-controller" probeResult="failure" output=< Dec 16 13:05:56 crc kubenswrapper[4757]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 13:05:56 crc kubenswrapper[4757]: > Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.074015 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.078522 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2mmpx" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.315203 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xjblp-config-slclz"] Dec 16 13:05:56 crc kubenswrapper[4757]: E1216 13:05:56.315731 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff595563-ea6e-4337-8018-275c60afebfb" containerName="swift-ring-rebalance" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.315839 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff595563-ea6e-4337-8018-275c60afebfb" containerName="swift-ring-rebalance" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.316074 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff595563-ea6e-4337-8018-275c60afebfb" containerName="swift-ring-rebalance" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.316609 
4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.319120 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.327751 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjblp-config-slclz"] Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.449468 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j69vw" event={"ID":"ff595563-ea6e-4337-8018-275c60afebfb","Type":"ContainerDied","Data":"1def7b327a3294f04906ff44d1f492b4c5088b52b2737d14c85c9d8b55743155"} Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.449534 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1def7b327a3294f04906ff44d1f492b4c5088b52b2737d14c85c9d8b55743155" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.450520 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j69vw" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.469839 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ddq9\" (UniqueName: \"kubernetes.io/projected/33465a3f-2865-4a03-97bb-8049d6fecf6b-kube-api-access-4ddq9\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.470161 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-log-ovn\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.470351 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run-ovn\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.470498 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-additional-scripts\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.470641 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-scripts\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.470783 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run\") pod 
\"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.575731 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ddq9\" (UniqueName: \"kubernetes.io/projected/33465a3f-2865-4a03-97bb-8049d6fecf6b-kube-api-access-4ddq9\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.575802 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-log-ovn\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.575918 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run-ovn\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.576142 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-additional-scripts\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.578314 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-additional-scripts\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.579281 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-log-ovn\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.579990 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-scripts\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.580194 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.580724 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run-ovn\") pod 
\"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.580863 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.582648 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-scripts\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.617271 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ddq9\" (UniqueName: \"kubernetes.io/projected/33465a3f-2865-4a03-97bb-8049d6fecf6b-kube-api-access-4ddq9\") pod \"ovn-controller-xjblp-config-slclz\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:56 crc kubenswrapper[4757]: I1216 13:05:56.639304 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:05:57 crc kubenswrapper[4757]: I1216 13:05:57.175487 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjblp-config-slclz"] Dec 16 13:05:57 crc kubenswrapper[4757]: I1216 13:05:57.825321 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:06:01 crc kubenswrapper[4757]: I1216 13:06:01.038198 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xjblp" podUID="824c8db6-764f-4062-85c5-3c0fcbe434ce" containerName="ovn-controller" probeResult="failure" output=< Dec 16 13:06:01 crc kubenswrapper[4757]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 13:06:01 crc kubenswrapper[4757]: > Dec 16 13:06:05 crc kubenswrapper[4757]: I1216 13:06:05.998393 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xjblp" podUID="824c8db6-764f-4062-85c5-3c0fcbe434ce" containerName="ovn-controller" probeResult="failure" output=< Dec 16 13:06:05 crc kubenswrapper[4757]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 13:06:05 crc kubenswrapper[4757]: > Dec 16 13:06:06 crc kubenswrapper[4757]: I1216 13:06:06.075310 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:06:06 crc kubenswrapper[4757]: I1216 13:06:06.083807 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/510c5136-4ca0-49c9-ba30-1cafb624d71f-etc-swift\") pod \"swift-storage-0\" (UID: \"510c5136-4ca0-49c9-ba30-1cafb624d71f\") " pod="openstack/swift-storage-0" Dec 16 13:06:06 crc kubenswrapper[4757]: I1216 13:06:06.223355 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 16 13:06:07 crc kubenswrapper[4757]: I1216 13:06:07.279752 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 16 13:06:09 crc kubenswrapper[4757]: W1216 13:06:09.216370 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33465a3f_2865_4a03_97bb_8049d6fecf6b.slice/crio-2d7cfb632a11b7412154e1b02421c28780f0a680fcc88aab63a021f8e76bc180 WatchSource:0}: Error finding container 2d7cfb632a11b7412154e1b02421c28780f0a680fcc88aab63a021f8e76bc180: Status 404 returned error can't find the container with id 2d7cfb632a11b7412154e1b02421c28780f0a680fcc88aab63a021f8e76bc180 Dec 16 13:06:09 crc kubenswrapper[4757]: E1216 13:06:09.268898 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 16 13:06:09 crc kubenswrapper[4757]: E1216 13:06:09.269128 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftzw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-54js6_openstack(fe9cebdb-26c9-4618-9640-5e17d5976d12): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:06:09 crc kubenswrapper[4757]: E1216 
13:06:09.270530 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-54js6" podUID="fe9cebdb-26c9-4618-9640-5e17d5976d12" Dec 16 13:06:09 crc kubenswrapper[4757]: I1216 13:06:09.568029 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp-config-slclz" event={"ID":"33465a3f-2865-4a03-97bb-8049d6fecf6b","Type":"ContainerStarted","Data":"2d7cfb632a11b7412154e1b02421c28780f0a680fcc88aab63a021f8e76bc180"} Dec 16 13:06:09 crc kubenswrapper[4757]: E1216 13:06:09.569793 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-54js6" podUID="fe9cebdb-26c9-4618-9640-5e17d5976d12" Dec 16 13:06:09 crc kubenswrapper[4757]: I1216 13:06:09.801362 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 13:06:10 crc kubenswrapper[4757]: I1216 13:06:10.575405 4757 generic.go:334] "Generic (PLEG): container finished" podID="33465a3f-2865-4a03-97bb-8049d6fecf6b" containerID="f5fd308ddb7510ee0e4c3b4afb280537f0a556bd7462ab07c7b6cade8869500e" exitCode=0 Dec 16 13:06:10 crc kubenswrapper[4757]: I1216 13:06:10.575466 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp-config-slclz" event={"ID":"33465a3f-2865-4a03-97bb-8049d6fecf6b","Type":"ContainerDied","Data":"f5fd308ddb7510ee0e4c3b4afb280537f0a556bd7462ab07c7b6cade8869500e"} Dec 16 13:06:10 crc kubenswrapper[4757]: I1216 13:06:10.576609 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"fbfb4c4c8b4907d2710881936550b08a605b61671efff97ca6f0ae08f3bf03d0"} Dec 16 13:06:10 crc kubenswrapper[4757]: I1216 13:06:10.993228 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xjblp" Dec 16 13:06:11 crc kubenswrapper[4757]: I1216 13:06:11.585963 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"d0a87134c1e90dba6d43a9e5cfdc4d2c7809aac8e4e1363a16769bb1f16e97b4"} Dec 16 13:06:11 crc kubenswrapper[4757]: I1216 13:06:11.586021 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"2c1ae99f6643dade50a7c71d25c22f04da31177c0e4e408c902197b0856b55f5"} Dec 16 13:06:11 crc kubenswrapper[4757]: I1216 13:06:11.586032 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"47950339c1f7a8f8b0884d17ee31318468c5e9b7c08552e820dd3531483ba0d5"} Dec 16 13:06:11 crc kubenswrapper[4757]: I1216 13:06:11.884456 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.032928 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-additional-scripts\") pod \"33465a3f-2865-4a03-97bb-8049d6fecf6b\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033069 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-log-ovn\") pod \"33465a3f-2865-4a03-97bb-8049d6fecf6b\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033316 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run-ovn\") pod \"33465a3f-2865-4a03-97bb-8049d6fecf6b\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033427 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "33465a3f-2865-4a03-97bb-8049d6fecf6b" (UID: "33465a3f-2865-4a03-97bb-8049d6fecf6b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033436 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ddq9\" (UniqueName: \"kubernetes.io/projected/33465a3f-2865-4a03-97bb-8049d6fecf6b-kube-api-access-4ddq9\") pod \"33465a3f-2865-4a03-97bb-8049d6fecf6b\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033465 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run\") pod \"33465a3f-2865-4a03-97bb-8049d6fecf6b\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033538 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-scripts\") pod \"33465a3f-2865-4a03-97bb-8049d6fecf6b\" (UID: \"33465a3f-2865-4a03-97bb-8049d6fecf6b\") " Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033581 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run" (OuterVolumeSpecName: "var-run") pod "33465a3f-2865-4a03-97bb-8049d6fecf6b" (UID: "33465a3f-2865-4a03-97bb-8049d6fecf6b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.033820 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "33465a3f-2865-4a03-97bb-8049d6fecf6b" (UID: "33465a3f-2865-4a03-97bb-8049d6fecf6b"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.034331 4757 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.034364 4757 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.034379 4757 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.034362 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-scripts" (OuterVolumeSpecName: "scripts") pod "33465a3f-2865-4a03-97bb-8049d6fecf6b" (UID: "33465a3f-2865-4a03-97bb-8049d6fecf6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.034437 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "33465a3f-2865-4a03-97bb-8049d6fecf6b" (UID: "33465a3f-2865-4a03-97bb-8049d6fecf6b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.051710 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33465a3f-2865-4a03-97bb-8049d6fecf6b-kube-api-access-4ddq9" (OuterVolumeSpecName: "kube-api-access-4ddq9") pod "33465a3f-2865-4a03-97bb-8049d6fecf6b" (UID: "33465a3f-2865-4a03-97bb-8049d6fecf6b"). InnerVolumeSpecName "kube-api-access-4ddq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.135576 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ddq9\" (UniqueName: \"kubernetes.io/projected/33465a3f-2865-4a03-97bb-8049d6fecf6b-kube-api-access-4ddq9\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.135921 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33465a3f-2865-4a03-97bb-8049d6fecf6b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.135936 4757 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33465a3f-2865-4a03-97bb-8049d6fecf6b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.595691 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"a6a91be77b1d7660925cf7c4348f4f7d7a0a28fa841203cbd21ba6f9e4b2f891"} Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.597053 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp-config-slclz" event={"ID":"33465a3f-2865-4a03-97bb-8049d6fecf6b","Type":"ContainerDied","Data":"2d7cfb632a11b7412154e1b02421c28780f0a680fcc88aab63a021f8e76bc180"} Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.597076 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d7cfb632a11b7412154e1b02421c28780f0a680fcc88aab63a021f8e76bc180" Dec 16 13:06:12 crc kubenswrapper[4757]: I1216 13:06:12.597138 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjblp-config-slclz" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.050790 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xjblp-config-slclz"] Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.059243 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xjblp-config-slclz"] Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.346696 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xjblp-config-v675n"] Dec 16 13:06:13 crc kubenswrapper[4757]: E1216 13:06:13.347031 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33465a3f-2865-4a03-97bb-8049d6fecf6b" containerName="ovn-config" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.347046 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="33465a3f-2865-4a03-97bb-8049d6fecf6b" containerName="ovn-config" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.347198 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="33465a3f-2865-4a03-97bb-8049d6fecf6b" containerName="ovn-config" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.347692 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.357362 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.364481 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjblp-config-v675n"] Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.371743 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mrxf\" (UniqueName: \"kubernetes.io/projected/d0e47314-c60b-48e0-a6c1-b0d955025f45-kube-api-access-4mrxf\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.371790 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run-ovn\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.371816 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-scripts\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.371839 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.371867 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-log-ovn\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.371957 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-additional-scripts\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473373 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-additional-scripts\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473651 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrxf\" (UniqueName: 
\"kubernetes.io/projected/d0e47314-c60b-48e0-a6c1-b0d955025f45-kube-api-access-4mrxf\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473678 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run-ovn\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473697 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-scripts\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473717 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473741 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-log-ovn\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.473980 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-log-ovn\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.474274 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.474310 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run-ovn\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.475886 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-additional-scripts\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.476370 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-scripts\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.497967 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mrxf\" (UniqueName: \"kubernetes.io/projected/d0e47314-c60b-48e0-a6c1-b0d955025f45-kube-api-access-4mrxf\") pod \"ovn-controller-xjblp-config-v675n\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.609928 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"9d3d54dc005eff474710a8e1f5546a00c5dcd8631a4910d07002d878d64ccfe5"} Dec 16 13:06:13 crc kubenswrapper[4757]: I1216 13:06:13.670789 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.227555 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjblp-config-v675n"] Dec 16 13:06:14 crc kubenswrapper[4757]: W1216 13:06:14.232739 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e47314_c60b_48e0_a6c1_b0d955025f45.slice/crio-8f2d7959b07131d61a9bd278d05ccbda56273e2d63d33291b8ad17ef0ce59b96 WatchSource:0}: Error finding container 8f2d7959b07131d61a9bd278d05ccbda56273e2d63d33291b8ad17ef0ce59b96: Status 404 returned error can't find the container with id 8f2d7959b07131d61a9bd278d05ccbda56273e2d63d33291b8ad17ef0ce59b96 Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.620771 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"daa77becf8118b62c8401b160cbb3b3b4e760d99dc82a841548b9958395fc65e"} Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.620811 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"68c30f8b9a7cd1db39130360dd1094bf34e17ec7e19dae5a797c74de678d103e"} Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.620821 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"329d72cc21b2d8b50b7e32d05995d97b9781dd327f099d29930e086b480e6363"} Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.623822 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp-config-v675n" event={"ID":"d0e47314-c60b-48e0-a6c1-b0d955025f45","Type":"ContainerStarted","Data":"d00c3d0699f3db5dc28702bbad2957f404f37709d9dd615c72ec6502ad88e6e0"} Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.623857 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp-config-v675n" event={"ID":"d0e47314-c60b-48e0-a6c1-b0d955025f45","Type":"ContainerStarted","Data":"8f2d7959b07131d61a9bd278d05ccbda56273e2d63d33291b8ad17ef0ce59b96"} Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.643084 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xjblp-config-v675n" 
podStartSLOduration=1.643065706 podStartE2EDuration="1.643065706s" podCreationTimestamp="2025-12-16 13:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:14.638890697 +0000 UTC m=+1160.066634493" watchObservedRunningTime="2025-12-16 13:06:14.643065706 +0000 UTC m=+1160.070809502" Dec 16 13:06:14 crc kubenswrapper[4757]: I1216 13:06:14.961208 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33465a3f-2865-4a03-97bb-8049d6fecf6b" path="/var/lib/kubelet/pods/33465a3f-2865-4a03-97bb-8049d6fecf6b/volumes" Dec 16 13:06:15 crc kubenswrapper[4757]: I1216 13:06:15.633914 4757 generic.go:334] "Generic (PLEG): container finished" podID="d0e47314-c60b-48e0-a6c1-b0d955025f45" containerID="d00c3d0699f3db5dc28702bbad2957f404f37709d9dd615c72ec6502ad88e6e0" exitCode=0 Dec 16 13:06:15 crc kubenswrapper[4757]: I1216 13:06:15.633995 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjblp-config-v675n" event={"ID":"d0e47314-c60b-48e0-a6c1-b0d955025f45","Type":"ContainerDied","Data":"d00c3d0699f3db5dc28702bbad2957f404f37709d9dd615c72ec6502ad88e6e0"} Dec 16 13:06:16 crc kubenswrapper[4757]: I1216 13:06:16.648385 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"b825532de8955409ae03d86330e2437d3c1690b105a639b65b6d49154a79a38c"} Dec 16 13:06:16 crc kubenswrapper[4757]: I1216 13:06:16.648791 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"17e197d1a8e8c7391b8b0f2555ed4cfb76c5530bfebc22c4b2ecb02516be2c20"} Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.021771 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.065694 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-scripts\") pod \"d0e47314-c60b-48e0-a6c1-b0d955025f45\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.065754 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run\") pod \"d0e47314-c60b-48e0-a6c1-b0d955025f45\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.065814 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-additional-scripts\") pod \"d0e47314-c60b-48e0-a6c1-b0d955025f45\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.065899 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run" (OuterVolumeSpecName: "var-run") pod "d0e47314-c60b-48e0-a6c1-b0d955025f45" (UID: "d0e47314-c60b-48e0-a6c1-b0d955025f45"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.066749 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d0e47314-c60b-48e0-a6c1-b0d955025f45" (UID: "d0e47314-c60b-48e0-a6c1-b0d955025f45"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.066844 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run-ovn\") pod \"d0e47314-c60b-48e0-a6c1-b0d955025f45\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.066878 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mrxf\" (UniqueName: \"kubernetes.io/projected/d0e47314-c60b-48e0-a6c1-b0d955025f45-kube-api-access-4mrxf\") pod \"d0e47314-c60b-48e0-a6c1-b0d955025f45\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.066964 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-log-ovn\") pod \"d0e47314-c60b-48e0-a6c1-b0d955025f45\" (UID: \"d0e47314-c60b-48e0-a6c1-b0d955025f45\") " Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.066915 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d0e47314-c60b-48e0-a6c1-b0d955025f45" (UID: "d0e47314-c60b-48e0-a6c1-b0d955025f45"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.067112 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d0e47314-c60b-48e0-a6c1-b0d955025f45" (UID: "d0e47314-c60b-48e0-a6c1-b0d955025f45"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.067394 4757 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.067410 4757 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.067420 4757 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0e47314-c60b-48e0-a6c1-b0d955025f45-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.067430 4757 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.067819 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-scripts" (OuterVolumeSpecName: "scripts") pod "d0e47314-c60b-48e0-a6c1-b0d955025f45" (UID: "d0e47314-c60b-48e0-a6c1-b0d955025f45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.074492 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e47314-c60b-48e0-a6c1-b0d955025f45-kube-api-access-4mrxf" (OuterVolumeSpecName: "kube-api-access-4mrxf") pod "d0e47314-c60b-48e0-a6c1-b0d955025f45" (UID: "d0e47314-c60b-48e0-a6c1-b0d955025f45"). InnerVolumeSpecName "kube-api-access-4mrxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.169536 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mrxf\" (UniqueName: \"kubernetes.io/projected/d0e47314-c60b-48e0-a6c1-b0d955025f45-kube-api-access-4mrxf\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.169827 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0e47314-c60b-48e0-a6c1-b0d955025f45-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.280247 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.294104 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xjblp-config-v675n"] Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.299675 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xjblp-config-v675n"] Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.658695 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f2d7959b07131d61a9bd278d05ccbda56273e2d63d33291b8ad17ef0ce59b96" Dec 16 13:06:17 crc kubenswrapper[4757]: I1216 13:06:17.658785 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xjblp-config-v675n" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.124421 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ctbr2"] Dec 16 13:06:18 crc kubenswrapper[4757]: E1216 13:06:18.124776 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e47314-c60b-48e0-a6c1-b0d955025f45" containerName="ovn-config" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.124794 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e47314-c60b-48e0-a6c1-b0d955025f45" containerName="ovn-config" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.124948 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e47314-c60b-48e0-a6c1-b0d955025f45" containerName="ovn-config" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.125429 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.141473 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ctbr2"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.214018 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-phhth"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.215152 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.239237 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-35f9-account-create-update-8lns5"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.240285 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.241538 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.245814 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-phhth"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.271586 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-35f9-account-create-update-8lns5"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.286462 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh8x2\" (UniqueName: \"kubernetes.io/projected/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-kube-api-access-mh8x2\") pod \"barbican-db-create-ctbr2\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.286526 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-operator-scripts\") pod \"barbican-db-create-ctbr2\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.342979 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e7fa-account-create-update-4hn76"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.343990 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.345739 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.365588 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e7fa-account-create-update-4hn76"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.387817 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh8x2\" (UniqueName: \"kubernetes.io/projected/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-kube-api-access-mh8x2\") pod \"barbican-db-create-ctbr2\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.387900 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-operator-scripts\") pod \"cinder-35f9-account-create-update-8lns5\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.387942 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sgqn\" (UniqueName: \"kubernetes.io/projected/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-kube-api-access-8sgqn\") pod \"cinder-db-create-phhth\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.387964 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-operator-scripts\") pod \"barbican-db-create-ctbr2\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.387991 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-operator-scripts\") pod \"cinder-db-create-phhth\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.388092 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lgq\" (UniqueName: \"kubernetes.io/projected/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-kube-api-access-r9lgq\") pod \"cinder-35f9-account-create-update-8lns5\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.389402 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-operator-scripts\") pod \"barbican-db-create-ctbr2\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.436765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh8x2\" (UniqueName: \"kubernetes.io/projected/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-kube-api-access-mh8x2\") pod 
\"barbican-db-create-ctbr2\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.439854 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.489357 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-operator-scripts\") pod \"cinder-35f9-account-create-update-8lns5\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.489407 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292c184a-83d5-46a5-a309-f72088414fe7-operator-scripts\") pod \"barbican-e7fa-account-create-update-4hn76\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.489461 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sgqn\" (UniqueName: \"kubernetes.io/projected/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-kube-api-access-8sgqn\") pod \"cinder-db-create-phhth\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.489508 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-operator-scripts\") pod \"cinder-db-create-phhth\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.489554 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnk5m\" (UniqueName: \"kubernetes.io/projected/292c184a-83d5-46a5-a309-f72088414fe7-kube-api-access-nnk5m\") pod \"barbican-e7fa-account-create-update-4hn76\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.489613 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lgq\" (UniqueName: \"kubernetes.io/projected/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-kube-api-access-r9lgq\") pod \"cinder-35f9-account-create-update-8lns5\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.490584 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-operator-scripts\") pod \"cinder-db-create-phhth\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.490591 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-operator-scripts\") pod \"cinder-35f9-account-create-update-8lns5\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " 
pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.510869 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-j8dhl"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.512141 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.535054 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rtqbc" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.535615 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.535824 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.538001 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.539669 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-j8dhl"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.540912 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sgqn\" (UniqueName: \"kubernetes.io/projected/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-kube-api-access-8sgqn\") pod \"cinder-db-create-phhth\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.551664 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lgq\" (UniqueName: \"kubernetes.io/projected/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-kube-api-access-r9lgq\") pod \"cinder-35f9-account-create-update-8lns5\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.554402 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.590843 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292c184a-83d5-46a5-a309-f72088414fe7-operator-scripts\") pod \"barbican-e7fa-account-create-update-4hn76\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.591146 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnk5m\" (UniqueName: \"kubernetes.io/projected/292c184a-83d5-46a5-a309-f72088414fe7-kube-api-access-nnk5m\") pod \"barbican-e7fa-account-create-update-4hn76\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.592058 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292c184a-83d5-46a5-a309-f72088414fe7-operator-scripts\") pod \"barbican-e7fa-account-create-update-4hn76\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.620661 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnk5m\" (UniqueName: \"kubernetes.io/projected/292c184a-83d5-46a5-a309-f72088414fe7-kube-api-access-nnk5m\") pod \"barbican-e7fa-account-create-update-4hn76\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.676357 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.687913 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9l9bq"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.697617 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.693537 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-config-data\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.727970 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-combined-ca-bundle\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.728235 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22z7q\" (UniqueName: \"kubernetes.io/projected/b5a76784-332d-479e-9cce-1f5acb1e828f-kube-api-access-22z7q\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.700794 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-44ee-account-create-update-9ql9v"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.766892 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9l9bq"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.767028 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.769686 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-44ee-account-create-update-9ql9v"] Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.772201 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.785558 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"33f56bc887759d3128167903b4b56e5100b9d5cfe0457d4e961316113b28fdd9"} Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.827558 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-phhth" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.829715 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h89j\" (UniqueName: \"kubernetes.io/projected/c1627bec-94b0-4d6b-bbd4-178fa53884ff-kube-api-access-7h89j\") pod \"neutron-44ee-account-create-update-9ql9v\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.829782 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2p8h\" (UniqueName: \"kubernetes.io/projected/ab469031-4907-4d0c-b47f-5c34d3af3858-kube-api-access-r2p8h\") pod \"neutron-db-create-9l9bq\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.829853 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22z7q\" (UniqueName: \"kubernetes.io/projected/b5a76784-332d-479e-9cce-1f5acb1e828f-kube-api-access-22z7q\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.829898 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1627bec-94b0-4d6b-bbd4-178fa53884ff-operator-scripts\") pod \"neutron-44ee-account-create-update-9ql9v\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.829978 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab469031-4907-4d0c-b47f-5c34d3af3858-operator-scripts\") pod \"neutron-db-create-9l9bq\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.830061 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-config-data\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.830093 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-combined-ca-bundle\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.842791 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-config-data\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.850377 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-combined-ca-bundle\") pod \"keystone-db-sync-j8dhl\" (UID: 
\"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.863219 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22z7q\" (UniqueName: \"kubernetes.io/projected/b5a76784-332d-479e-9cce-1f5acb1e828f-kube-api-access-22z7q\") pod \"keystone-db-sync-j8dhl\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.934351 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h89j\" (UniqueName: \"kubernetes.io/projected/c1627bec-94b0-4d6b-bbd4-178fa53884ff-kube-api-access-7h89j\") pod \"neutron-44ee-account-create-update-9ql9v\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.934903 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2p8h\" (UniqueName: \"kubernetes.io/projected/ab469031-4907-4d0c-b47f-5c34d3af3858-kube-api-access-r2p8h\") pod \"neutron-db-create-9l9bq\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.934969 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1627bec-94b0-4d6b-bbd4-178fa53884ff-operator-scripts\") pod \"neutron-44ee-account-create-update-9ql9v\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.936828 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1627bec-94b0-4d6b-bbd4-178fa53884ff-operator-scripts\") pod \"neutron-44ee-account-create-update-9ql9v\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.937340 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab469031-4907-4d0c-b47f-5c34d3af3858-operator-scripts\") pod \"neutron-db-create-9l9bq\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.939630 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab469031-4907-4d0c-b47f-5c34d3af3858-operator-scripts\") pod \"neutron-db-create-9l9bq\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.968448 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.970505 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2p8h\" (UniqueName: \"kubernetes.io/projected/ab469031-4907-4d0c-b47f-5c34d3af3858-kube-api-access-r2p8h\") pod \"neutron-db-create-9l9bq\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:18 crc kubenswrapper[4757]: I1216 13:06:18.993359 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h89j\" (UniqueName: \"kubernetes.io/projected/c1627bec-94b0-4d6b-bbd4-178fa53884ff-kube-api-access-7h89j\") pod \"neutron-44ee-account-create-update-9ql9v\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.029983 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e47314-c60b-48e0-a6c1-b0d955025f45" path="/var/lib/kubelet/pods/d0e47314-c60b-48e0-a6c1-b0d955025f45/volumes" Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.061270 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.129528 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.175598 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ctbr2"] Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.195448 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-35f9-account-create-update-8lns5"] Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.553926 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-phhth"] Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.574965 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e7fa-account-create-update-4hn76"] Dec 16 13:06:19 crc kubenswrapper[4757]: W1216 13:06:19.607359 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292c184a_83d5_46a5_a309_f72088414fe7.slice/crio-e8e189e270f02e282cd06c333b65622908243a82b6dd3ca34ae74a6c8911bcff WatchSource:0}: Error finding container e8e189e270f02e282cd06c333b65622908243a82b6dd3ca34ae74a6c8911bcff: Status 404 returned error can't find the container with id e8e189e270f02e282cd06c333b65622908243a82b6dd3ca34ae74a6c8911bcff Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.800189 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-35f9-account-create-update-8lns5" event={"ID":"39f33d4c-7376-4b9c-bb1f-192bcefc1b77","Type":"ContainerStarted","Data":"89acdcc06d286f115e592eb1bfe34a32dd4efd8db259d00066404108efeec215"} Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.802617 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-phhth" event={"ID":"7f76f2d4-df5a-46a3-8a3f-7c326a48b045","Type":"ContainerStarted","Data":"83b52a8cbbe2b2e869ce6144f5ad645f37665c3b25101a6b9d5f17779f33d81b"} Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.803643 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e7fa-account-create-update-4hn76" 
event={"ID":"292c184a-83d5-46a5-a309-f72088414fe7","Type":"ContainerStarted","Data":"e8e189e270f02e282cd06c333b65622908243a82b6dd3ca34ae74a6c8911bcff"} Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.822671 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9l9bq"] Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.833759 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-j8dhl"] Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.835165 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"cb8dda816560efca7af6a9883dd9a3451c823d9ac15b6d64a4eff9cf61ffda11"} Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.839236 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ctbr2" event={"ID":"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8","Type":"ContainerStarted","Data":"021ee6489cc9570e9c07b435fa84a75cc3afae69cf50edc30bf904753fad9504"} Dec 16 13:06:19 crc kubenswrapper[4757]: I1216 13:06:19.952956 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-44ee-account-create-update-9ql9v"] Dec 16 13:06:19 crc kubenswrapper[4757]: W1216 13:06:19.972625 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1627bec_94b0_4d6b_bbd4_178fa53884ff.slice/crio-bd982e12f7c58b4bddf7dfc00f5dd9be637353815f70a511209b8cd5c2c2f144 WatchSource:0}: Error finding container bd982e12f7c58b4bddf7dfc00f5dd9be637353815f70a511209b8cd5c2c2f144: Status 404 returned error can't find the container with id bd982e12f7c58b4bddf7dfc00f5dd9be637353815f70a511209b8cd5c2c2f144 Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.851347 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e7fa-account-create-update-4hn76" event={"ID":"292c184a-83d5-46a5-a309-f72088414fe7","Type":"ContainerStarted","Data":"a1d23ada33990caf50e3e437704cff1d9bb4630f199e164c03f149247b429bdc"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.856835 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-44ee-account-create-update-9ql9v" event={"ID":"c1627bec-94b0-4d6b-bbd4-178fa53884ff","Type":"ContainerStarted","Data":"88acc2605ef2abf8cf35f0edf60f1ac278643cd7f4d05fdb449bb7358e0220e3"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.856884 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-44ee-account-create-update-9ql9v" event={"ID":"c1627bec-94b0-4d6b-bbd4-178fa53884ff","Type":"ContainerStarted","Data":"bd982e12f7c58b4bddf7dfc00f5dd9be637353815f70a511209b8cd5c2c2f144"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.860349 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9l9bq" event={"ID":"ab469031-4907-4d0c-b47f-5c34d3af3858","Type":"ContainerStarted","Data":"1755d59f079f71e8e90e60d6f83d6f422fb8342e1841643a23ba8672a74519ec"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.860392 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9l9bq" event={"ID":"ab469031-4907-4d0c-b47f-5c34d3af3858","Type":"ContainerStarted","Data":"42ffa7eca6d669dff36a4b53f88d18ac1c909cfc5b638259f5178c531ba4e2c2"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.872308 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-e7fa-account-create-update-4hn76" podStartSLOduration=2.872291638 podStartE2EDuration="2.872291638s" podCreationTimestamp="2025-12-16 13:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:20.870458625 +0000 UTC m=+1166.298202421" watchObservedRunningTime="2025-12-16 13:06:20.872291638 +0000 UTC m=+1166.300035434" Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.878722 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"df0e5b29aeb23dbd4aba07aaddfb0cc56ba6c24a9e26d884702c7cc78a6de61b"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.878756 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"d68dbd137b106c905a8b7e17907c1a8c3274928e5a024a9dcc9b67ee4205c518"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.884399 4757 generic.go:334] "Generic (PLEG): container finished" podID="9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" containerID="d231d340be5da33f9ad59b43d837ac9ddeb1a295101f8566161166a846910f49" exitCode=0 Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.884489 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ctbr2" event={"ID":"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8","Type":"ContainerDied","Data":"d231d340be5da33f9ad59b43d837ac9ddeb1a295101f8566161166a846910f49"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.885450 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-44ee-account-create-update-9ql9v" podStartSLOduration=2.885433738 podStartE2EDuration="2.885433738s" podCreationTimestamp="2025-12-16 13:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:20.884351172 +0000 UTC m=+1166.312094978" watchObservedRunningTime="2025-12-16 13:06:20.885433738 +0000 UTC m=+1166.313177534" Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.890477 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-35f9-account-create-update-8lns5" event={"ID":"39f33d4c-7376-4b9c-bb1f-192bcefc1b77","Type":"ContainerStarted","Data":"0beebbc3b3fac544c9c5650a3b010b51648dba01671ca67035230d4eb4e8dfff"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.892378 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j8dhl" event={"ID":"b5a76784-332d-479e-9cce-1f5acb1e828f","Type":"ContainerStarted","Data":"384b5a1631e80c668f16f6d7b2d6f26224de1b78c5527fa5d9434fa4e8907009"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.894936 4757 generic.go:334] "Generic (PLEG): container finished" podID="7f76f2d4-df5a-46a3-8a3f-7c326a48b045" containerID="5431d1e9740d00772d8917012d0e343e3e5d7f745e54749265d56f18fe08f951" exitCode=0 Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.894982 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-phhth" event={"ID":"7f76f2d4-df5a-46a3-8a3f-7c326a48b045","Type":"ContainerDied","Data":"5431d1e9740d00772d8917012d0e343e3e5d7f745e54749265d56f18fe08f951"} Dec 16 13:06:20 crc kubenswrapper[4757]: I1216 13:06:20.995143 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-35f9-account-create-update-8lns5" podStartSLOduration=2.995121493 podStartE2EDuration="2.995121493s" podCreationTimestamp="2025-12-16 13:06:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:20.973249647 +0000 UTC m=+1166.400993463" watchObservedRunningTime="2025-12-16 13:06:20.995121493 +0000 UTC m=+1166.422865289" Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.906190 4757 generic.go:334] "Generic (PLEG): container finished" podID="c1627bec-94b0-4d6b-bbd4-178fa53884ff" containerID="88acc2605ef2abf8cf35f0edf60f1ac278643cd7f4d05fdb449bb7358e0220e3" exitCode=0 Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.906873 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-44ee-account-create-update-9ql9v" event={"ID":"c1627bec-94b0-4d6b-bbd4-178fa53884ff","Type":"ContainerDied","Data":"88acc2605ef2abf8cf35f0edf60f1ac278643cd7f4d05fdb449bb7358e0220e3"} Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.913395 4757 generic.go:334] "Generic (PLEG): container finished" podID="ab469031-4907-4d0c-b47f-5c34d3af3858" containerID="1755d59f079f71e8e90e60d6f83d6f422fb8342e1841643a23ba8672a74519ec" exitCode=0 Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.913520 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9l9bq" event={"ID":"ab469031-4907-4d0c-b47f-5c34d3af3858","Type":"ContainerDied","Data":"1755d59f079f71e8e90e60d6f83d6f422fb8342e1841643a23ba8672a74519ec"} Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.933291 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"510c5136-4ca0-49c9-ba30-1cafb624d71f","Type":"ContainerStarted","Data":"653affab9b15ab94a90f62db5c990141703f7e93dc1b7e41557ecce2f8171b39"} Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.938439 4757 generic.go:334] "Generic (PLEG): container finished" podID="39f33d4c-7376-4b9c-bb1f-192bcefc1b77" containerID="0beebbc3b3fac544c9c5650a3b010b51648dba01671ca67035230d4eb4e8dfff" exitCode=0 Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.938537 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-35f9-account-create-update-8lns5" event={"ID":"39f33d4c-7376-4b9c-bb1f-192bcefc1b77","Type":"ContainerDied","Data":"0beebbc3b3fac544c9c5650a3b010b51648dba01671ca67035230d4eb4e8dfff"} Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.940872 4757 generic.go:334] "Generic (PLEG): container finished" podID="292c184a-83d5-46a5-a309-f72088414fe7" containerID="a1d23ada33990caf50e3e437704cff1d9bb4630f199e164c03f149247b429bdc" exitCode=0 Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.941142 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e7fa-account-create-update-4hn76" event={"ID":"292c184a-83d5-46a5-a309-f72088414fe7","Type":"ContainerDied","Data":"a1d23ada33990caf50e3e437704cff1d9bb4630f199e164c03f149247b429bdc"} Dec 16 13:06:21 crc kubenswrapper[4757]: I1216 13:06:21.971338 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=43.354192682 podStartE2EDuration="48.97131831s" podCreationTimestamp="2025-12-16 13:05:33 +0000 UTC" firstStartedPulling="2025-12-16 13:06:09.815391794 +0000 UTC m=+1155.243135590" lastFinishedPulling="2025-12-16 13:06:15.432517422 +0000 UTC m=+1160.860261218" observedRunningTime="2025-12-16 13:06:21.970221325 
+0000 UTC m=+1167.397965131" watchObservedRunningTime="2025-12-16 13:06:21.97131831 +0000 UTC m=+1167.399062106" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.301524 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-4rpn8"] Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.303323 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.316785 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.330412 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-config\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.330489 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.330569 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.330595 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.330617 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnpb\" (UniqueName: \"kubernetes.io/projected/8efffcc7-626b-47bf-aa44-c44a74763f9a-kube-api-access-gpnpb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.330645 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.332882 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-4rpn8"] Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.432402 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnpb\" (UniqueName: \"kubernetes.io/projected/8efffcc7-626b-47bf-aa44-c44a74763f9a-kube-api-access-gpnpb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: 
\"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.432464 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.432535 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-config\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.432591 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.432675 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.432700 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.434379 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.437355 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.437461 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-config\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.438090 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc 
kubenswrapper[4757]: I1216 13:06:22.438314 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.445034 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.456961 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.468391 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnpb\" (UniqueName: \"kubernetes.io/projected/8efffcc7-626b-47bf-aa44-c44a74763f9a-kube-api-access-gpnpb\") pod \"dnsmasq-dns-5c79d794d7-4rpn8\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.484731 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-phhth" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.533325 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab469031-4907-4d0c-b47f-5c34d3af3858-operator-scripts\") pod \"ab469031-4907-4d0c-b47f-5c34d3af3858\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.533517 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2p8h\" (UniqueName: \"kubernetes.io/projected/ab469031-4907-4d0c-b47f-5c34d3af3858-kube-api-access-r2p8h\") pod \"ab469031-4907-4d0c-b47f-5c34d3af3858\" (UID: \"ab469031-4907-4d0c-b47f-5c34d3af3858\") " Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.533642 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh8x2\" (UniqueName: \"kubernetes.io/projected/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-kube-api-access-mh8x2\") pod \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.533772 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-operator-scripts\") pod \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.534319 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f76f2d4-df5a-46a3-8a3f-7c326a48b045" (UID: "7f76f2d4-df5a-46a3-8a3f-7c326a48b045"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.534393 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sgqn\" (UniqueName: \"kubernetes.io/projected/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-kube-api-access-8sgqn\") pod \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\" (UID: \"7f76f2d4-df5a-46a3-8a3f-7c326a48b045\") " Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.534422 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-operator-scripts\") pod \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\" (UID: \"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8\") " Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.534975 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" (UID: "9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.535333 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab469031-4907-4d0c-b47f-5c34d3af3858-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab469031-4907-4d0c-b47f-5c34d3af3858" (UID: "ab469031-4907-4d0c-b47f-5c34d3af3858"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.538246 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab469031-4907-4d0c-b47f-5c34d3af3858-kube-api-access-r2p8h" (OuterVolumeSpecName: "kube-api-access-r2p8h") pod "ab469031-4907-4d0c-b47f-5c34d3af3858" (UID: "ab469031-4907-4d0c-b47f-5c34d3af3858"). InnerVolumeSpecName "kube-api-access-r2p8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.539341 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-kube-api-access-8sgqn" (OuterVolumeSpecName: "kube-api-access-8sgqn") pod "7f76f2d4-df5a-46a3-8a3f-7c326a48b045" (UID: "7f76f2d4-df5a-46a3-8a3f-7c326a48b045"). InnerVolumeSpecName "kube-api-access-8sgqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.539392 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-kube-api-access-mh8x2" (OuterVolumeSpecName: "kube-api-access-mh8x2") pod "9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" (UID: "9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8"). InnerVolumeSpecName "kube-api-access-mh8x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.541124 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2p8h\" (UniqueName: \"kubernetes.io/projected/ab469031-4907-4d0c-b47f-5c34d3af3858-kube-api-access-r2p8h\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.541155 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh8x2\" (UniqueName: \"kubernetes.io/projected/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-kube-api-access-mh8x2\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.541171 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.541183 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sgqn\" (UniqueName: \"kubernetes.io/projected/7f76f2d4-df5a-46a3-8a3f-7c326a48b045-kube-api-access-8sgqn\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.541195 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.541207 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab469031-4907-4d0c-b47f-5c34d3af3858-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.713757 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.966031 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9l9bq" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.967926 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9l9bq" event={"ID":"ab469031-4907-4d0c-b47f-5c34d3af3858","Type":"ContainerDied","Data":"42ffa7eca6d669dff36a4b53f88d18ac1c909cfc5b638259f5178c531ba4e2c2"} Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.967972 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42ffa7eca6d669dff36a4b53f88d18ac1c909cfc5b638259f5178c531ba4e2c2" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.986298 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ctbr2" event={"ID":"9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8","Type":"ContainerDied","Data":"021ee6489cc9570e9c07b435fa84a75cc3afae69cf50edc30bf904753fad9504"} Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.986337 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="021ee6489cc9570e9c07b435fa84a75cc3afae69cf50edc30bf904753fad9504" Dec 16 13:06:22 crc kubenswrapper[4757]: I1216 13:06:22.986459 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ctbr2" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.003058 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-phhth" event={"ID":"7f76f2d4-df5a-46a3-8a3f-7c326a48b045","Type":"ContainerDied","Data":"83b52a8cbbe2b2e869ce6144f5ad645f37665c3b25101a6b9d5f17779f33d81b"} Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.003126 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b52a8cbbe2b2e869ce6144f5ad645f37665c3b25101a6b9d5f17779f33d81b" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.003289 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-phhth" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.010655 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-54js6" event={"ID":"fe9cebdb-26c9-4618-9640-5e17d5976d12","Type":"ContainerStarted","Data":"6e41157660f62a54e1cee8244fb66a8511e105080f30e5e8a7cdb0bfb294e497"} Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.046805 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-54js6" podStartSLOduration=2.248218093 podStartE2EDuration="32.046781877s" podCreationTimestamp="2025-12-16 13:05:51 +0000 UTC" firstStartedPulling="2025-12-16 13:05:51.963830062 +0000 UTC m=+1137.391573858" lastFinishedPulling="2025-12-16 13:06:21.762393846 +0000 UTC m=+1167.190137642" observedRunningTime="2025-12-16 13:06:23.037806096 +0000 UTC m=+1168.465549892" watchObservedRunningTime="2025-12-16 13:06:23.046781877 +0000 UTC m=+1168.474525683" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.262845 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-4rpn8"] Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.582453 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.628654 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.631392 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.676058 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1627bec-94b0-4d6b-bbd4-178fa53884ff-operator-scripts\") pod \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.676126 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9lgq\" (UniqueName: \"kubernetes.io/projected/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-kube-api-access-r9lgq\") pod \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.676265 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h89j\" (UniqueName: \"kubernetes.io/projected/c1627bec-94b0-4d6b-bbd4-178fa53884ff-kube-api-access-7h89j\") pod \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\" (UID: \"c1627bec-94b0-4d6b-bbd4-178fa53884ff\") " Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.676320 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-operator-scripts\") pod \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\" (UID: \"39f33d4c-7376-4b9c-bb1f-192bcefc1b77\") " Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.676382 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292c184a-83d5-46a5-a309-f72088414fe7-operator-scripts\") pod \"292c184a-83d5-46a5-a309-f72088414fe7\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.676420 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnk5m\" (UniqueName: \"kubernetes.io/projected/292c184a-83d5-46a5-a309-f72088414fe7-kube-api-access-nnk5m\") pod \"292c184a-83d5-46a5-a309-f72088414fe7\" (UID: \"292c184a-83d5-46a5-a309-f72088414fe7\") " Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.680190 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/292c184a-83d5-46a5-a309-f72088414fe7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "292c184a-83d5-46a5-a309-f72088414fe7" (UID: "292c184a-83d5-46a5-a309-f72088414fe7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.680271 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1627bec-94b0-4d6b-bbd4-178fa53884ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1627bec-94b0-4d6b-bbd4-178fa53884ff" (UID: "c1627bec-94b0-4d6b-bbd4-178fa53884ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.681709 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1627bec-94b0-4d6b-bbd4-178fa53884ff-kube-api-access-7h89j" (OuterVolumeSpecName: "kube-api-access-7h89j") pod "c1627bec-94b0-4d6b-bbd4-178fa53884ff" (UID: "c1627bec-94b0-4d6b-bbd4-178fa53884ff"). 
InnerVolumeSpecName "kube-api-access-7h89j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.682315 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292c184a-83d5-46a5-a309-f72088414fe7-kube-api-access-nnk5m" (OuterVolumeSpecName: "kube-api-access-nnk5m") pod "292c184a-83d5-46a5-a309-f72088414fe7" (UID: "292c184a-83d5-46a5-a309-f72088414fe7"). InnerVolumeSpecName "kube-api-access-nnk5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.682585 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39f33d4c-7376-4b9c-bb1f-192bcefc1b77" (UID: "39f33d4c-7376-4b9c-bb1f-192bcefc1b77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.683319 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-kube-api-access-r9lgq" (OuterVolumeSpecName: "kube-api-access-r9lgq") pod "39f33d4c-7376-4b9c-bb1f-192bcefc1b77" (UID: "39f33d4c-7376-4b9c-bb1f-192bcefc1b77"). InnerVolumeSpecName "kube-api-access-r9lgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.778796 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1627bec-94b0-4d6b-bbd4-178fa53884ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.778835 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9lgq\" (UniqueName: \"kubernetes.io/projected/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-kube-api-access-r9lgq\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.778850 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h89j\" (UniqueName: \"kubernetes.io/projected/c1627bec-94b0-4d6b-bbd4-178fa53884ff-kube-api-access-7h89j\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.778862 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39f33d4c-7376-4b9c-bb1f-192bcefc1b77-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.778873 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292c184a-83d5-46a5-a309-f72088414fe7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:23 crc kubenswrapper[4757]: I1216 13:06:23.778884 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnk5m\" (UniqueName: \"kubernetes.io/projected/292c184a-83d5-46a5-a309-f72088414fe7-kube-api-access-nnk5m\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.022066 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" event={"ID":"8efffcc7-626b-47bf-aa44-c44a74763f9a","Type":"ContainerStarted","Data":"bf4a601cf4c61b45c9ef44c1dc5c22762ff4cfec80f282e3db80ad0e629e63c9"} Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.023270 4757 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-44ee-account-create-update-9ql9v" event={"ID":"c1627bec-94b0-4d6b-bbd4-178fa53884ff","Type":"ContainerDied","Data":"bd982e12f7c58b4bddf7dfc00f5dd9be637353815f70a511209b8cd5c2c2f144"} Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.023288 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-44ee-account-create-update-9ql9v" Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.023306 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd982e12f7c58b4bddf7dfc00f5dd9be637353815f70a511209b8cd5c2c2f144" Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.024961 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-35f9-account-create-update-8lns5" event={"ID":"39f33d4c-7376-4b9c-bb1f-192bcefc1b77","Type":"ContainerDied","Data":"89acdcc06d286f115e592eb1bfe34a32dd4efd8db259d00066404108efeec215"} Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.024986 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89acdcc06d286f115e592eb1bfe34a32dd4efd8db259d00066404108efeec215" Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.024986 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-35f9-account-create-update-8lns5" Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.026318 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e7fa-account-create-update-4hn76" event={"ID":"292c184a-83d5-46a5-a309-f72088414fe7","Type":"ContainerDied","Data":"e8e189e270f02e282cd06c333b65622908243a82b6dd3ca34ae74a6c8911bcff"} Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.026342 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e189e270f02e282cd06c333b65622908243a82b6dd3ca34ae74a6c8911bcff" Dec 16 13:06:24 crc kubenswrapper[4757]: I1216 13:06:24.026383 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e7fa-account-create-update-4hn76" Dec 16 13:06:25 crc kubenswrapper[4757]: I1216 13:06:25.037967 4757 generic.go:334] "Generic (PLEG): container finished" podID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerID="37e2c93447e8030756b33da98cea9caebf652010794bc2f62bec0274660d9053" exitCode=0 Dec 16 13:06:25 crc kubenswrapper[4757]: I1216 13:06:25.038179 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" event={"ID":"8efffcc7-626b-47bf-aa44-c44a74763f9a","Type":"ContainerDied","Data":"37e2c93447e8030756b33da98cea9caebf652010794bc2f62bec0274660d9053"} Dec 16 13:06:28 crc kubenswrapper[4757]: I1216 13:06:28.063550 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" event={"ID":"8efffcc7-626b-47bf-aa44-c44a74763f9a","Type":"ContainerStarted","Data":"4b0e7dbca92560f00211cde1b539ef23776d5e423469c331c705743192848902"} Dec 16 13:06:28 crc kubenswrapper[4757]: I1216 13:06:28.063859 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:28 crc kubenswrapper[4757]: I1216 13:06:28.064719 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j8dhl" event={"ID":"b5a76784-332d-479e-9cce-1f5acb1e828f","Type":"ContainerStarted","Data":"91f4d3761e0ea7dc6f6f468b0c0462a930abcff747338f88d16a9ab8bc8d6512"} Dec 16 13:06:28 crc kubenswrapper[4757]: I1216 13:06:28.090560 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podStartSLOduration=6.090539705 podStartE2EDuration="6.090539705s" podCreationTimestamp="2025-12-16 13:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:28.083673602 +0000 UTC m=+1173.511417438" watchObservedRunningTime="2025-12-16 13:06:28.090539705 +0000 UTC m=+1173.518283501" Dec 16 13:06:28 crc kubenswrapper[4757]: I1216 13:06:28.099844 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-j8dhl" podStartSLOduration=2.230565063 podStartE2EDuration="10.099814221s" podCreationTimestamp="2025-12-16 13:06:18 +0000 UTC" firstStartedPulling="2025-12-16 13:06:19.912812236 +0000 UTC m=+1165.340556032" lastFinishedPulling="2025-12-16 13:06:27.782061394 +0000 UTC m=+1173.209805190" observedRunningTime="2025-12-16 13:06:28.098564943 +0000 UTC m=+1173.526308749" watchObservedRunningTime="2025-12-16 13:06:28.099814221 +0000 UTC m=+1173.527558017" Dec 16 13:06:32 crc kubenswrapper[4757]: I1216 13:06:32.716298 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:06:32 crc kubenswrapper[4757]: I1216 13:06:32.772386 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g2xbp"] Dec 16 13:06:32 crc kubenswrapper[4757]: I1216 13:06:32.772624 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerName="dnsmasq-dns" containerID="cri-o://77b29cac38112414de492bbc46e4122b6f17403cad344525a6855eaeec15e3c9" gracePeriod=10 Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.114920 4757 generic.go:334] "Generic (PLEG): container finished" podID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" 
containerID="77b29cac38112414de492bbc46e4122b6f17403cad344525a6855eaeec15e3c9" exitCode=0 Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.115247 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" event={"ID":"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240","Type":"ContainerDied","Data":"77b29cac38112414de492bbc46e4122b6f17403cad344525a6855eaeec15e3c9"} Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.253276 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.396879 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-config\") pod \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.397962 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-sb\") pod \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.398293 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4277l\" (UniqueName: \"kubernetes.io/projected/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-kube-api-access-4277l\") pod \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.398317 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-dns-svc\") pod \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.398431 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-nb\") pod \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\" (UID: \"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240\") " Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.420842 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-kube-api-access-4277l" (OuterVolumeSpecName: "kube-api-access-4277l") pod "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" (UID: "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240"). InnerVolumeSpecName "kube-api-access-4277l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.449715 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" (UID: "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.451731 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" (UID: "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.454780 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-config" (OuterVolumeSpecName: "config") pod "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" (UID: "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.463997 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" (UID: "2158bcd5-2f3c-4a1a-b8d7-b98fced4c240"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.500509 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.500547 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.500558 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4277l\" (UniqueName: \"kubernetes.io/projected/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-kube-api-access-4277l\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.500567 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:33 crc kubenswrapper[4757]: I1216 13:06:33.500575 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.127113 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" event={"ID":"2158bcd5-2f3c-4a1a-b8d7-b98fced4c240","Type":"ContainerDied","Data":"f9bffc40f3a7f1251c13f79954d2f907577a02c933cb5aebc90c6e6120fb6ce0"} Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.127181 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g2xbp" Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.127213 4757 scope.go:117] "RemoveContainer" containerID="77b29cac38112414de492bbc46e4122b6f17403cad344525a6855eaeec15e3c9" Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.161787 4757 scope.go:117] "RemoveContainer" containerID="e39d0d04f18a1fc35ca9ec758cd3d29ce079ff14009ffcfa422a3406edff54e5" Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.176476 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g2xbp"] Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.186204 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g2xbp"] Dec 16 13:06:34 crc kubenswrapper[4757]: I1216 13:06:34.959748 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" path="/var/lib/kubelet/pods/2158bcd5-2f3c-4a1a-b8d7-b98fced4c240/volumes" Dec 16 13:06:35 crc kubenswrapper[4757]: I1216 13:06:35.137558 4757 generic.go:334] "Generic (PLEG): container finished" podID="b5a76784-332d-479e-9cce-1f5acb1e828f" containerID="91f4d3761e0ea7dc6f6f468b0c0462a930abcff747338f88d16a9ab8bc8d6512" exitCode=0 Dec 16 13:06:35 crc kubenswrapper[4757]: I1216 13:06:35.137638 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j8dhl" event={"ID":"b5a76784-332d-479e-9cce-1f5acb1e828f","Type":"ContainerDied","Data":"91f4d3761e0ea7dc6f6f468b0c0462a930abcff747338f88d16a9ab8bc8d6512"} Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.410558 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.549907 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22z7q\" (UniqueName: \"kubernetes.io/projected/b5a76784-332d-479e-9cce-1f5acb1e828f-kube-api-access-22z7q\") pod \"b5a76784-332d-479e-9cce-1f5acb1e828f\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.549986 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-combined-ca-bundle\") pod \"b5a76784-332d-479e-9cce-1f5acb1e828f\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.550180 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-config-data\") pod \"b5a76784-332d-479e-9cce-1f5acb1e828f\" (UID: \"b5a76784-332d-479e-9cce-1f5acb1e828f\") " Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.564878 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a76784-332d-479e-9cce-1f5acb1e828f-kube-api-access-22z7q" (OuterVolumeSpecName: "kube-api-access-22z7q") pod "b5a76784-332d-479e-9cce-1f5acb1e828f" (UID: "b5a76784-332d-479e-9cce-1f5acb1e828f"). InnerVolumeSpecName "kube-api-access-22z7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.576398 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5a76784-332d-479e-9cce-1f5acb1e828f" (UID: "b5a76784-332d-479e-9cce-1f5acb1e828f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.598130 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-config-data" (OuterVolumeSpecName: "config-data") pod "b5a76784-332d-479e-9cce-1f5acb1e828f" (UID: "b5a76784-332d-479e-9cce-1f5acb1e828f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.651692 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22z7q\" (UniqueName: \"kubernetes.io/projected/b5a76784-332d-479e-9cce-1f5acb1e828f-kube-api-access-22z7q\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.651729 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:36 crc kubenswrapper[4757]: I1216 13:06:36.651738 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a76784-332d-479e-9cce-1f5acb1e828f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.154242 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j8dhl" event={"ID":"b5a76784-332d-479e-9cce-1f5acb1e828f","Type":"ContainerDied","Data":"384b5a1631e80c668f16f6d7b2d6f26224de1b78c5527fa5d9434fa4e8907009"} Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.154549 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384b5a1631e80c668f16f6d7b2d6f26224de1b78c5527fa5d9434fa4e8907009" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.154293 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-j8dhl" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.448672 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w4tw2"] Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450105 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerName="init" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450206 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerName="init" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450286 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1627bec-94b0-4d6b-bbd4-178fa53884ff" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450354 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1627bec-94b0-4d6b-bbd4-178fa53884ff" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450416 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f76f2d4-df5a-46a3-8a3f-7c326a48b045" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450473 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f76f2d4-df5a-46a3-8a3f-7c326a48b045" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450540 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a76784-332d-479e-9cce-1f5acb1e828f" containerName="keystone-db-sync" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450592 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a76784-332d-479e-9cce-1f5acb1e828f" containerName="keystone-db-sync" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450667 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f33d4c-7376-4b9c-bb1f-192bcefc1b77" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450741 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f33d4c-7376-4b9c-bb1f-192bcefc1b77" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450812 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292c184a-83d5-46a5-a309-f72088414fe7" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450870 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="292c184a-83d5-46a5-a309-f72088414fe7" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.450940 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab469031-4907-4d0c-b47f-5c34d3af3858" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.450999 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab469031-4907-4d0c-b47f-5c34d3af3858" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.451082 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerName="dnsmasq-dns" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451135 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerName="dnsmasq-dns" Dec 16 13:06:37 crc kubenswrapper[4757]: E1216 13:06:37.451191 4757 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451258 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451542 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451621 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="292c184a-83d5-46a5-a309-f72088414fe7" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451699 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f33d4c-7376-4b9c-bb1f-192bcefc1b77" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451762 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="2158bcd5-2f3c-4a1a-b8d7-b98fced4c240" containerName="dnsmasq-dns" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451824 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a76784-332d-479e-9cce-1f5acb1e828f" containerName="keystone-db-sync" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451884 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1627bec-94b0-4d6b-bbd4-178fa53884ff" containerName="mariadb-account-create-update" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.451945 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab469031-4907-4d0c-b47f-5c34d3af3858" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.452019 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f76f2d4-df5a-46a3-8a3f-7c326a48b045" containerName="mariadb-database-create" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.452622 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.456243 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.456271 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.456485 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rtqbc" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.459106 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.460483 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-4s69s"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.462196 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.471699 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w4tw2"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.475968 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.486303 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-4s69s"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567365 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-combined-ca-bundle\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567445 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-config\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567475 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjcn\" (UniqueName: \"kubernetes.io/projected/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-kube-api-access-8sjcn\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567507 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-credential-keys\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567545 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567641 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-config-data\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567673 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-svc\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567698 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567739 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-scripts\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567774 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-fernet-keys\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567812 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.567866 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tm4z\" (UniqueName: \"kubernetes.io/projected/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-kube-api-access-7tm4z\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.650878 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8d6465865-4lfwj"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.652438 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.656687 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.657122 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mvsgz" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.657280 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.657793 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.668969 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-config\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669036 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjcn\" (UniqueName: \"kubernetes.io/projected/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-kube-api-access-8sjcn\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669070 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-credential-keys\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669108 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669190 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-config-data\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669229 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-svc\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669254 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669299 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-scripts\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669328 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-fernet-keys\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669362 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669413 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tm4z\" (UniqueName: \"kubernetes.io/projected/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-kube-api-access-7tm4z\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.669436 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-combined-ca-bundle\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.670990 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-svc\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.671636 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.675418 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.676365 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.680565 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-config\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: 
\"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.685455 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-credential-keys\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.685542 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-combined-ca-bundle\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.690174 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-fernet-keys\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.696965 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-config-data\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.705365 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-scripts\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.712018 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8d6465865-4lfwj"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.733681 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tm4z\" (UniqueName: \"kubernetes.io/projected/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-kube-api-access-7tm4z\") pod \"keystone-bootstrap-w4tw2\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.745628 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjcn\" (UniqueName: \"kubernetes.io/projected/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-kube-api-access-8sjcn\") pod \"dnsmasq-dns-5b868669f-4s69s\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") " pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.771181 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgsj\" (UniqueName: \"kubernetes.io/projected/aa69e580-99e4-454b-9fa8-906d0410aea5-kube-api-access-skgsj\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.771509 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-config-data\") pod 
\"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.771666 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa69e580-99e4-454b-9fa8-906d0410aea5-horizon-secret-key\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.771798 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-scripts\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.771947 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa69e580-99e4-454b-9fa8-906d0410aea5-logs\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.780716 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.828625 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.834266 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7lhwd"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.835374 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.851131 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qgfl8" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.851466 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.851599 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.869312 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7lhwd"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.874670 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa69e580-99e4-454b-9fa8-906d0410aea5-logs\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.874724 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgsj\" (UniqueName: \"kubernetes.io/projected/aa69e580-99e4-454b-9fa8-906d0410aea5-kube-api-access-skgsj\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.874760 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-config-data\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.874803 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa69e580-99e4-454b-9fa8-906d0410aea5-horizon-secret-key\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.874840 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-scripts\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.875629 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-scripts\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.875838 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa69e580-99e4-454b-9fa8-906d0410aea5-logs\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.876912 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-config-data\") pod 
\"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.899624 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa69e580-99e4-454b-9fa8-906d0410aea5-horizon-secret-key\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.915106 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77668464d5-b2zbm"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.917450 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.949467 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgsj\" (UniqueName: \"kubernetes.io/projected/aa69e580-99e4-454b-9fa8-906d0410aea5-kube-api-access-skgsj\") pod \"horizon-8d6465865-4lfwj\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.967600 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.973957 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77668464d5-b2zbm"] Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.975837 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpnll\" (UniqueName: \"kubernetes.io/projected/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-kube-api-access-bpnll\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.976325 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-scripts\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.976552 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-etc-machine-id\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.976706 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-config-data\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 13:06:37.976950 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-combined-ca-bundle\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:37 crc kubenswrapper[4757]: I1216 
13:06:37.977127 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-db-sync-config-data\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.075979 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-4s69s"] Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081313 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-combined-ca-bundle\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081357 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-db-sync-config-data\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081397 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-config-data\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081454 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-scripts\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081484 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpnll\" (UniqueName: \"kubernetes.io/projected/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-kube-api-access-bpnll\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081507 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-scripts\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081527 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvdn\" (UniqueName: \"kubernetes.io/projected/73702eed-2df7-4b2a-8bc5-04e038078d7b-kube-api-access-cfvdn\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081588 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-etc-machine-id\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " 
pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081612 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73702eed-2df7-4b2a-8bc5-04e038078d7b-horizon-secret-key\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081637 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-config-data\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.081683 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73702eed-2df7-4b2a-8bc5-04e038078d7b-logs\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.083431 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-etc-machine-id\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.102804 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-combined-ca-bundle\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.103481 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-scripts\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.106931 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-config-data\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.107383 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-db-sync-config-data\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.107520 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-86cds"] Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.108780 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-86cds" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.122271 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.122574 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.123416 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2hwt5" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.148795 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpnll\" (UniqueName: \"kubernetes.io/projected/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-kube-api-access-bpnll\") pod \"cinder-db-sync-7lhwd\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.181325 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-pcl59"] Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.183016 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73702eed-2df7-4b2a-8bc5-04e038078d7b-logs\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.183895 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73702eed-2df7-4b2a-8bc5-04e038078d7b-logs\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.183937 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.187214 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-config-data\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.187547 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-scripts\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.187729 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvdn\" (UniqueName: \"kubernetes.io/projected/73702eed-2df7-4b2a-8bc5-04e038078d7b-kube-api-access-cfvdn\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.185047 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tjbz5"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.194150 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.194259 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.194558 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qwv54"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.195040 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73702eed-2df7-4b2a-8bc5-04e038078d7b-horizon-secret-key\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.196875 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-config-data\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.197485 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.198625 4757 generic.go:334] "Generic (PLEG): container finished" podID="fe9cebdb-26c9-4618-9640-5e17d5976d12" containerID="6e41157660f62a54e1cee8244fb66a8511e105080f30e5e8a7cdb0bfb294e497" exitCode=0
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.198634 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-scripts\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.198674 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-54js6" event={"ID":"fe9cebdb-26c9-4618-9640-5e17d5976d12","Type":"ContainerDied","Data":"6e41157660f62a54e1cee8244fb66a8511e105080f30e5e8a7cdb0bfb294e497"}
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.203428 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l49sv"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.203653 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.205759 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73702eed-2df7-4b2a-8bc5-04e038078d7b-horizon-secret-key\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.247107 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvdn\" (UniqueName: \"kubernetes.io/projected/73702eed-2df7-4b2a-8bc5-04e038078d7b-kube-api-access-cfvdn\") pod \"horizon-77668464d5-b2zbm\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.251218 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7lhwd"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.265085 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77668464d5-b2zbm"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.305571 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-config\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.305853 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25lt5\" (UniqueName: \"kubernetes.io/projected/25d57428-c378-4e57-87c5-f1fff2398cec-kube-api-access-25lt5\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.305961 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-config-data\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.306063 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d57428-c378-4e57-87c5-f1fff2398cec-logs\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.306156 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-combined-ca-bundle\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.306286 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-combined-ca-bundle\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.306471 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmk8\" (UniqueName: \"kubernetes.io/projected/fc9a7054-7c7f-4e36-8d57-e095087a7878-kube-api-access-jsmk8\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.306759 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9r5\" (UniqueName: \"kubernetes.io/projected/362eaecb-4139-44f9-a651-3e14cc2d6ae2-kube-api-access-5w9r5\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.306927 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-scripts\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.307226 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-combined-ca-bundle\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.307277 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-db-sync-config-data\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.327858 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-86cds"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.362087 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tjbz5"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.371645 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pcl59"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.381315 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-s8xq7"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.385967 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.406941 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-s8xq7"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.412819 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-config\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.412899 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25lt5\" (UniqueName: \"kubernetes.io/projected/25d57428-c378-4e57-87c5-f1fff2398cec-kube-api-access-25lt5\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.412928 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-config-data\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.412952 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d57428-c378-4e57-87c5-f1fff2398cec-logs\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.412975 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-combined-ca-bundle\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.413024 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-combined-ca-bundle\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.413047 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmk8\" (UniqueName: \"kubernetes.io/projected/fc9a7054-7c7f-4e36-8d57-e095087a7878-kube-api-access-jsmk8\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.413138 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9r5\" (UniqueName: \"kubernetes.io/projected/362eaecb-4139-44f9-a651-3e14cc2d6ae2-kube-api-access-5w9r5\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.413164 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-scripts\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.413209 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-combined-ca-bundle\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.413261 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-db-sync-config-data\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.418565 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d57428-c378-4e57-87c5-f1fff2398cec-logs\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.421743 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-db-sync-config-data\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.424426 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.426966 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-combined-ca-bundle\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.427919 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-combined-ca-bundle\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.428776 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-config-data\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.431363 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-scripts\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.431465 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-combined-ca-bundle\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.434404 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-config\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.437092 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.457085 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.458140 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmk8\" (UniqueName: \"kubernetes.io/projected/fc9a7054-7c7f-4e36-8d57-e095087a7878-kube-api-access-jsmk8\") pod \"neutron-db-sync-pcl59\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.459702 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.460070 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.464051 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25lt5\" (UniqueName: \"kubernetes.io/projected/25d57428-c378-4e57-87c5-f1fff2398cec-kube-api-access-25lt5\") pod \"placement-db-sync-86cds\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") " pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.467211 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9r5\" (UniqueName: \"kubernetes.io/projected/362eaecb-4139-44f9-a651-3e14cc2d6ae2-kube-api-access-5w9r5\") pod \"barbican-db-sync-tjbz5\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.516135 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-config\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.516613 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.516845 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgv5c\" (UniqueName: \"kubernetes.io/projected/7ba28476-9739-461a-858c-9144c2470948-kube-api-access-cgv5c\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.516961 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.517079 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-svc\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.517195 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.539604 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-86cds"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.570484 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-pcl59"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621585 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgv5c\" (UniqueName: \"kubernetes.io/projected/7ba28476-9739-461a-858c-9144c2470948-kube-api-access-cgv5c\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621641 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621686 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621714 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-svc\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621762 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-config-data\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621791 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621813 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs2rv\" (UniqueName: \"kubernetes.io/projected/32bc2074-3d53-44b4-8e6c-c500a2617944-kube-api-access-vs2rv\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621830 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-config\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621847 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.621956 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-log-httpd\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.622066 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-scripts\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.622123 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-run-httpd\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.622162 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.622765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.623115 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.623846 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.624117 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-config\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.627438 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-svc\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.627814 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tjbz5"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.647168 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgv5c\" (UniqueName: \"kubernetes.io/projected/7ba28476-9739-461a-858c-9144c2470948-kube-api-access-cgv5c\") pod \"dnsmasq-dns-cf78879c9-s8xq7\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.713580 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725094 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725220 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-config-data\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725267 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs2rv\" (UniqueName: \"kubernetes.io/projected/32bc2074-3d53-44b4-8e6c-c500a2617944-kube-api-access-vs2rv\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725311 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725354 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-log-httpd\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725393 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-scripts\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725424 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-run-httpd\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.725916 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-run-httpd\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.726719 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-log-httpd\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.730762 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.735277 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-config-data\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.735643 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-scripts\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.742586 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.790376 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs2rv\" (UniqueName: \"kubernetes.io/projected/32bc2074-3d53-44b4-8e6c-c500a2617944-kube-api-access-vs2rv\") pod \"ceilometer-0\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.829816 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 13:06:38 crc kubenswrapper[4757]: I1216 13:06:38.875053 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w4tw2"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.226142 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4tw2" event={"ID":"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8","Type":"ContainerStarted","Data":"73dde4b86e8bc291a90c8050a25f291ba10ea1f0dedf6ba25ff7df190ce74603"}
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.247572 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77668464d5-b2zbm"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.260221 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8d6465865-4lfwj"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.268830 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-4s69s"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.301316 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7lhwd"]
Dec 16 13:06:39 crc kubenswrapper[4757]: W1216 13:06:39.339962 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48bb858_bd6e_4dbc_a17a_5fd5e1275e00.slice/crio-cf72460b542b71a19f17bbda38b33db4f696dd3995f9ac76e1fdfac04b32a377 WatchSource:0}: Error finding container cf72460b542b71a19f17bbda38b33db4f696dd3995f9ac76e1fdfac04b32a377: Status 404 returned error can't find the container with id cf72460b542b71a19f17bbda38b33db4f696dd3995f9ac76e1fdfac04b32a377
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.386740 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-86cds"]
Dec 16 13:06:39 crc kubenswrapper[4757]: W1216 13:06:39.388819 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25d57428_c378_4e57_87c5_f1fff2398cec.slice/crio-b2637d8efcd2a610a4942f335c526c5c2a2391b7f76e50fb356f94e260d09867 WatchSource:0}: Error finding container b2637d8efcd2a610a4942f335c526c5c2a2391b7f76e50fb356f94e260d09867: Status 404 returned error can't find the container with id b2637d8efcd2a610a4942f335c526c5c2a2391b7f76e50fb356f94e260d09867
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.551429 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-pcl59"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.645994 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-s8xq7"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.785499 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tjbz5"]
Dec 16 13:06:39 crc kubenswrapper[4757]: I1216 13:06:39.796534 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.164238 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-54js6"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.212995 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-config-data\") pod \"fe9cebdb-26c9-4618-9640-5e17d5976d12\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") "
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.213455 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-db-sync-config-data\") pod \"fe9cebdb-26c9-4618-9640-5e17d5976d12\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") "
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.213511 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftzw4\" (UniqueName: \"kubernetes.io/projected/fe9cebdb-26c9-4618-9640-5e17d5976d12-kube-api-access-ftzw4\") pod \"fe9cebdb-26c9-4618-9640-5e17d5976d12\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") "
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.213716 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-combined-ca-bundle\") pod \"fe9cebdb-26c9-4618-9640-5e17d5976d12\" (UID: \"fe9cebdb-26c9-4618-9640-5e17d5976d12\") "
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.240429 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fe9cebdb-26c9-4618-9640-5e17d5976d12" (UID: "fe9cebdb-26c9-4618-9640-5e17d5976d12"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.242635 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9cebdb-26c9-4618-9640-5e17d5976d12-kube-api-access-ftzw4" (OuterVolumeSpecName: "kube-api-access-ftzw4") pod "fe9cebdb-26c9-4618-9640-5e17d5976d12" (UID: "fe9cebdb-26c9-4618-9640-5e17d5976d12"). InnerVolumeSpecName "kube-api-access-ftzw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.304728 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4tw2" event={"ID":"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8","Type":"ContainerStarted","Data":"fef3b9ba90c33271b530e822a9cf915b454fffe64618a682d8d11e0a8e188cd3"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.323890 4757 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.323951 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftzw4\" (UniqueName: \"kubernetes.io/projected/fe9cebdb-26c9-4618-9640-5e17d5976d12-kube-api-access-ftzw4\") on node \"crc\" DevicePath \"\""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.334135 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77668464d5-b2zbm" event={"ID":"73702eed-2df7-4b2a-8bc5-04e038078d7b","Type":"ContainerStarted","Data":"adff2b2e06e0d4c8f9ade06fbbb9cfb6ff1ec52d7dd3d537bf139866c10c8500"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.344525 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8d6465865-4lfwj" event={"ID":"aa69e580-99e4-454b-9fa8-906d0410aea5","Type":"ContainerStarted","Data":"05078933b163c5ea0454940374b93dad47813396739081db34021c3a30ad5a9f"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.344943 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-config-data" (OuterVolumeSpecName: "config-data") pod "fe9cebdb-26c9-4618-9640-5e17d5976d12" (UID: "fe9cebdb-26c9-4618-9640-5e17d5976d12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.349405 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w4tw2" podStartSLOduration=3.349379693 podStartE2EDuration="3.349379693s" podCreationTimestamp="2025-12-16 13:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:40.334470402 +0000 UTC m=+1185.762214208" watchObservedRunningTime="2025-12-16 13:06:40.349379693 +0000 UTC m=+1185.777123489"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.361920 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe9cebdb-26c9-4618-9640-5e17d5976d12" (UID: "fe9cebdb-26c9-4618-9640-5e17d5976d12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.362731 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-54js6" event={"ID":"fe9cebdb-26c9-4618-9640-5e17d5976d12","Type":"ContainerDied","Data":"d6ab5147ef6c1966a23300c72a184e52c4cf767e373738ba8d03d435856cb567"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.362758 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ab5147ef6c1966a23300c72a184e52c4cf767e373738ba8d03d435856cb567"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.362821 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-54js6"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.372861 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tjbz5" event={"ID":"362eaecb-4139-44f9-a651-3e14cc2d6ae2","Type":"ContainerStarted","Data":"35b8c3889adb1686b338f16a57cc08b1c2746bf3764724b5fbca08bd0b93b185"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.374811 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lhwd" event={"ID":"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00","Type":"ContainerStarted","Data":"cf72460b542b71a19f17bbda38b33db4f696dd3995f9ac76e1fdfac04b32a377"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.376795 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bc2074-3d53-44b4-8e6c-c500a2617944","Type":"ContainerStarted","Data":"e1e423eeac2f066df36b8340d33357cb2f2d56161cc7f3cd572a6cef88dec90e"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.385308 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pcl59" event={"ID":"fc9a7054-7c7f-4e36-8d57-e095087a7878","Type":"ContainerStarted","Data":"e432176b39a460d47e7d77399b0f4c007df8990074a5a85f35961c0774cacecc"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.385368 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pcl59" event={"ID":"fc9a7054-7c7f-4e36-8d57-e095087a7878","Type":"ContainerStarted","Data":"2c2d9ed917162169e453b25dfddd2dc9e28edcbdc6eeb816a63b02fa65224627"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.416794 4757 generic.go:334] "Generic (PLEG): container finished" podID="b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" containerID="f81444edfe8abf118b19cd7163cb836efcc7033cfaf63cc8b3588bb53ac29a95" exitCode=0
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.417026 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-4s69s" event={"ID":"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05","Type":"ContainerDied","Data":"f81444edfe8abf118b19cd7163cb836efcc7033cfaf63cc8b3588bb53ac29a95"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.417049 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-4s69s" event={"ID":"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05","Type":"ContainerStarted","Data":"c32fa5705fec32ce0ff5c33daf8e0a927df573b6d699c6b145d5dbbc1436589d"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.424188 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86cds" event={"ID":"25d57428-c378-4e57-87c5-f1fff2398cec","Type":"ContainerStarted","Data":"b2637d8efcd2a610a4942f335c526c5c2a2391b7f76e50fb356f94e260d09867"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.426766 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.426818 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe9cebdb-26c9-4618-9640-5e17d5976d12-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.438901 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-pcl59" podStartSLOduration=2.438881663 podStartE2EDuration="2.438881663s" podCreationTimestamp="2025-12-16 13:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:40.409648133 +0000 UTC m=+1185.837391929" watchObservedRunningTime="2025-12-16 13:06:40.438881663 +0000 UTC m=+1185.866625459"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.442878 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" event={"ID":"7ba28476-9739-461a-858c-9144c2470948","Type":"ContainerStarted","Data":"37f6bd7ecae68ae0b0bd81a0dc07e844a550d084873e61a81b135ceb7c81df8a"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.442913 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" event={"ID":"7ba28476-9739-461a-858c-9144c2470948","Type":"ContainerStarted","Data":"f502fa5d46bb38bf04008c2861899ea7fd22383475dc773c83cf10ad0b601792"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.893195 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77668464d5-b2zbm"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.950110 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b9cdf76cf-dw2p5"]
Dec 16 13:06:41 crc kubenswrapper[4757]: E1216 13:06:40.950489 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe9cebdb-26c9-4618-9640-5e17d5976d12" containerName="glance-db-sync"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.950500 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe9cebdb-26c9-4618-9640-5e17d5976d12" containerName="glance-db-sync"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.950676 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe9cebdb-26c9-4618-9640-5e17d5976d12" containerName="glance-db-sync"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:40.953270 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.004291 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b9cdf76cf-dw2p5"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.039051 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-config-data\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.039132 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-logs\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.039184 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-scripts\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.039294 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncncm\" (UniqueName: \"kubernetes.io/projected/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-kube-api-access-ncncm\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.039356 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-horizon-secret-key\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.140772 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-logs\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.141197 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-scripts\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.141307 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncncm\" (UniqueName: \"kubernetes.io/projected/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-kube-api-access-ncncm\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.141393 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-horizon-secret-key\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.141426 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-config-data\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.141424 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-logs\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.143031 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-config-data\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.143819 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-scripts\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.177411 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-horizon-secret-key\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.181219 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncncm\" (UniqueName: \"kubernetes.io/projected/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-kube-api-access-ncncm\") pod \"horizon-5b9cdf76cf-dw2p5\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.284937 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9cdf76cf-dw2p5"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.410391 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.507268 4757 generic.go:334] "Generic (PLEG): container finished" podID="7ba28476-9739-461a-858c-9144c2470948" containerID="37f6bd7ecae68ae0b0bd81a0dc07e844a550d084873e61a81b135ceb7c81df8a" exitCode=0
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.508174 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" event={"ID":"7ba28476-9739-461a-858c-9144c2470948","Type":"ContainerDied","Data":"37f6bd7ecae68ae0b0bd81a0dc07e844a550d084873e61a81b135ceb7c81df8a"}
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.695180 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-s8xq7"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.739426 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s2sxx"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.750415 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.765583 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s2sxx"]
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.894562 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnhh\" (UniqueName: \"kubernetes.io/projected/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-kube-api-access-cgnhh\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.894899 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.894935 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.894968 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.894985 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:41 crc kubenswrapper[4757]: I1216 13:06:41.895071 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-config\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.996503 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-config\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.996600 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnhh\" (UniqueName: \"kubernetes.io/projected/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-kube-api-access-cgnhh\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.996638 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.996676 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.996718 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.996740 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.997808 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.998505 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-config\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:41.999839 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.001413 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.001668 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.043506 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnhh\" (UniqueName: \"kubernetes.io/projected/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-kube-api-access-cgnhh\") pod \"dnsmasq-dns-56df8fb6b7-s2sxx\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.109808 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.412839 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-4s69s"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.487491 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 16 13:06:42 crc kubenswrapper[4757]: E1216 13:06:42.493603 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" containerName="init"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.493826 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" containerName="init"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.500183 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" containerName="init"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.507303 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.510830 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.515793 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.519316 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24n8w"
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.592700 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-swift-storage-0\") pod \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") "
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.593265 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-sb\") pod \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") "
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.593358 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-svc\") pod \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") "
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.593376 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-config\") pod \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") "
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.593409 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sjcn\" (UniqueName: \"kubernetes.io/projected/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-kube-api-access-8sjcn\") pod \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") "
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.593423 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-nb\") pod \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\" (UID: \"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05\") "
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.609914 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.651643 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" event={"ID":"7ba28476-9739-461a-858c-9144c2470948","Type":"ContainerStarted","Data":"e8d5bb4f19ba0d7a0d7b006a0096fa8d83444821441997a2be6b805ae1d485a9"}
Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.651796 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" podUID="7ba28476-9739-461a-858c-9144c2470948" containerName="dnsmasq-dns" containerID="cri-o://e8d5bb4f19ba0d7a0d7b006a0096fa8d83444821441997a2be6b805ae1d485a9" gracePeriod=10
Dec 16 13:06:42 
crc kubenswrapper[4757]: I1216 13:06:42.652091 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.674161 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-4s69s" event={"ID":"b62c28d5-0dfc-42c2-9e47-3c81b68e8d05","Type":"ContainerDied","Data":"c32fa5705fec32ce0ff5c33daf8e0a927df573b6d699c6b145d5dbbc1436589d"} Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.674395 4757 scope.go:117] "RemoveContainer" containerID="f81444edfe8abf118b19cd7163cb836efcc7033cfaf63cc8b3588bb53ac29a95" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.674575 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-4s69s" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702545 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702605 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702654 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-scripts\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702698 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mldc\" (UniqueName: \"kubernetes.io/projected/a1cc3bc2-c644-4732-a745-2c63515caf83-kube-api-access-7mldc\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702729 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-config-data\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702762 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-logs\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.702780 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" 
(UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.745804 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" (UID: "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.751960 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b9cdf76cf-dw2p5"] Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.757477 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-kube-api-access-8sjcn" (OuterVolumeSpecName: "kube-api-access-8sjcn") pod "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" (UID: "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05"). InnerVolumeSpecName "kube-api-access-8sjcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.762110 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" (UID: "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.782284 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" podStartSLOduration=4.782246814 podStartE2EDuration="4.782246814s" podCreationTimestamp="2025-12-16 13:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:42.740300191 +0000 UTC m=+1188.168043997" watchObservedRunningTime="2025-12-16 13:06:42.782246814 +0000 UTC m=+1188.209990610" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.791892 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" (UID: "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.792453 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-config" (OuterVolumeSpecName: "config") pod "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" (UID: "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.801228 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" (UID: "b62c28d5-0dfc-42c2-9e47-3c81b68e8d05"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807094 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-logs\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807148 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807200 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807355 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807479 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-scripts\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807594 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mldc\" (UniqueName: \"kubernetes.io/projected/a1cc3bc2-c644-4732-a745-2c63515caf83-kube-api-access-7mldc\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807656 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-config-data\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807777 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807792 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807867 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:42 crc 
kubenswrapper[4757]: I1216 13:06:42.807882 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807893 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.807905 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sjcn\" (UniqueName: \"kubernetes.io/projected/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05-kube-api-access-8sjcn\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.814612 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-logs\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.815988 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.826621 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.835920 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.837965 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-config-data\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.843679 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-scripts\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.865830 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mldc\" (UniqueName: \"kubernetes.io/projected/a1cc3bc2-c644-4732-a745-2c63515caf83-kube-api-access-7mldc\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.912424 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " pod="openstack/glance-default-external-api-0" Dec 16 13:06:42 crc kubenswrapper[4757]: I1216 13:06:42.925371 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.083913 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-4s69s"] Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.102426 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-4s69s"] Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.161218 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.164578 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.170587 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.198425 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s2sxx"] Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.222464 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231601 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231693 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231726 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231747 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231778 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ncnx\" (UniqueName: \"kubernetes.io/projected/c98dac12-1368-4c66-9456-0e364ca153d7-kube-api-access-8ncnx\") pod \"glance-default-internal-api-0\" (UID: 
\"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231824 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.231842 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335239 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335597 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335619 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335639 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335657 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335694 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ncnx\" (UniqueName: \"kubernetes.io/projected/c98dac12-1368-4c66-9456-0e364ca153d7-kube-api-access-8ncnx\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335748 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.335764 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.337875 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.345899 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.347808 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.363997 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.373544 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.375771 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ncnx\" (UniqueName: \"kubernetes.io/projected/c98dac12-1368-4c66-9456-0e364ca153d7-kube-api-access-8ncnx\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.390558 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.507683 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.739222 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" event={"ID":"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee","Type":"ContainerStarted","Data":"a19d2e5427d2d6b4409ce88916f6139b60e936a6207597a096d0cdb246ffea98"} Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.753348 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9cdf76cf-dw2p5" event={"ID":"9f6a7cd0-596f-4058-9bde-db8c55aca8c0","Type":"ContainerStarted","Data":"912054bc8f5e30deea7c554b44f2eab745153f47665f9f043cf8f66f4b340334"} Dec 16 13:06:43 crc kubenswrapper[4757]: I1216 13:06:43.892997 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:06:44 crc kubenswrapper[4757]: I1216 13:06:44.304786 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:06:44 crc kubenswrapper[4757]: I1216 13:06:44.777464 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" event={"ID":"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee","Type":"ContainerStarted","Data":"d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a"} Dec 16 13:06:44 crc kubenswrapper[4757]: I1216 13:06:44.788317 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1cc3bc2-c644-4732-a745-2c63515caf83","Type":"ContainerStarted","Data":"7bae95c1593a3ccb3a00af7875561b3f66bc37eee0f79e026d090848958b0ac7"} Dec 16 13:06:44 crc kubenswrapper[4757]: I1216 13:06:44.792083 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c98dac12-1368-4c66-9456-0e364ca153d7","Type":"ContainerStarted","Data":"f852324ad0486f9cd6d82e47c132aab181a0e53a5dc6b046a01b9595adc60faf"} Dec 16 13:06:44 crc kubenswrapper[4757]: I1216 13:06:44.801309 4757 generic.go:334] "Generic (PLEG): container finished" podID="7ba28476-9739-461a-858c-9144c2470948" containerID="e8d5bb4f19ba0d7a0d7b006a0096fa8d83444821441997a2be6b805ae1d485a9" exitCode=0 Dec 16 13:06:44 crc kubenswrapper[4757]: I1216 13:06:44.801367 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" event={"ID":"7ba28476-9739-461a-858c-9144c2470948","Type":"ContainerDied","Data":"e8d5bb4f19ba0d7a0d7b006a0096fa8d83444821441997a2be6b805ae1d485a9"} Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.015187 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62c28d5-0dfc-42c2-9e47-3c81b68e8d05" path="/var/lib/kubelet/pods/b62c28d5-0dfc-42c2-9e47-3c81b68e8d05/volumes" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.614132 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.742477 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-swift-storage-0\") pod \"7ba28476-9739-461a-858c-9144c2470948\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.743505 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-sb\") pod \"7ba28476-9739-461a-858c-9144c2470948\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.743579 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-nb\") pod \"7ba28476-9739-461a-858c-9144c2470948\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.743609 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-svc\") pod \"7ba28476-9739-461a-858c-9144c2470948\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.743744 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-config\") pod \"7ba28476-9739-461a-858c-9144c2470948\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.743842 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgv5c\" (UniqueName: \"kubernetes.io/projected/7ba28476-9739-461a-858c-9144c2470948-kube-api-access-cgv5c\") pod \"7ba28476-9739-461a-858c-9144c2470948\" (UID: \"7ba28476-9739-461a-858c-9144c2470948\") " Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.760023 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba28476-9739-461a-858c-9144c2470948-kube-api-access-cgv5c" (OuterVolumeSpecName: "kube-api-access-cgv5c") pod "7ba28476-9739-461a-858c-9144c2470948" (UID: "7ba28476-9739-461a-858c-9144c2470948"). InnerVolumeSpecName "kube-api-access-cgv5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.810152 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ba28476-9739-461a-858c-9144c2470948" (UID: "7ba28476-9739-461a-858c-9144c2470948"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.821098 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ba28476-9739-461a-858c-9144c2470948" (UID: "7ba28476-9739-461a-858c-9144c2470948"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.829596 4757 generic.go:334] "Generic (PLEG): container finished" podID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerID="d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a" exitCode=0 Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.829697 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" event={"ID":"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee","Type":"ContainerDied","Data":"d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a"} Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.834028 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ba28476-9739-461a-858c-9144c2470948" (UID: "7ba28476-9739-461a-858c-9144c2470948"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.839118 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-config" (OuterVolumeSpecName: "config") pod "7ba28476-9739-461a-858c-9144c2470948" (UID: "7ba28476-9739-461a-858c-9144c2470948"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.839976 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ba28476-9739-461a-858c-9144c2470948" (UID: "7ba28476-9739-461a-858c-9144c2470948"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.840911 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" event={"ID":"7ba28476-9739-461a-858c-9144c2470948","Type":"ContainerDied","Data":"f502fa5d46bb38bf04008c2861899ea7fd22383475dc773c83cf10ad0b601792"} Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.840969 4757 scope.go:117] "RemoveContainer" containerID="e8d5bb4f19ba0d7a0d7b006a0096fa8d83444821441997a2be6b805ae1d485a9" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.841149 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-s8xq7" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.847613 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.847784 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.847869 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.847939 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.848234 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba28476-9739-461a-858c-9144c2470948-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:45 crc kubenswrapper[4757]: I1216 13:06:45.848525 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgv5c\" (UniqueName: \"kubernetes.io/projected/7ba28476-9739-461a-858c-9144c2470948-kube-api-access-cgv5c\") on node \"crc\" DevicePath \"\"" Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.003111 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-s8xq7"] Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.008077 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-s8xq7"] Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.019345 4757 scope.go:117] "RemoveContainer" containerID="37f6bd7ecae68ae0b0bd81a0dc07e844a550d084873e61a81b135ceb7c81df8a" Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.858754 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" event={"ID":"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee","Type":"ContainerStarted","Data":"c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e"} Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.861325 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1cc3bc2-c644-4732-a745-2c63515caf83","Type":"ContainerStarted","Data":"b5ad814ce268cbd19cec50bb8eba771876e5abb2a23e2154408ff781ef72f60f"} Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.863788 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c98dac12-1368-4c66-9456-0e364ca153d7","Type":"ContainerStarted","Data":"e2f940cded57ea9914d909a7a61ce4ba1e6e652a540bee524f68152d3e7c3db2"} Dec 16 13:06:46 crc kubenswrapper[4757]: I1216 13:06:46.964458 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba28476-9739-461a-858c-9144c2470948" path="/var/lib/kubelet/pods/7ba28476-9739-461a-858c-9144c2470948/volumes" Dec 16 13:06:47 crc kubenswrapper[4757]: I1216 13:06:47.879422 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" Dec 
16 13:06:47 crc kubenswrapper[4757]: I1216 13:06:47.905808 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" podStartSLOduration=6.9057891300000005 podStartE2EDuration="6.90578913s" podCreationTimestamp="2025-12-16 13:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:06:47.898284384 +0000 UTC m=+1193.326028190" watchObservedRunningTime="2025-12-16 13:06:47.90578913 +0000 UTC m=+1193.333532926" Dec 16 13:06:49 crc kubenswrapper[4757]: I1216 13:06:49.918046 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:06:50 crc kubenswrapper[4757]: I1216 13:06:50.017396 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.013819 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8d6465865-4lfwj"] Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.101507 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75ccc7d896-jmrk9"] Dec 16 13:06:51 crc kubenswrapper[4757]: E1216 13:06:51.102318 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba28476-9739-461a-858c-9144c2470948" containerName="init" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.102335 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba28476-9739-461a-858c-9144c2470948" containerName="init" Dec 16 13:06:51 crc kubenswrapper[4757]: E1216 13:06:51.102348 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba28476-9739-461a-858c-9144c2470948" containerName="dnsmasq-dns" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.102358 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba28476-9739-461a-858c-9144c2470948" containerName="dnsmasq-dns" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.102609 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba28476-9739-461a-858c-9144c2470948" containerName="dnsmasq-dns" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.104776 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.127119 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.134848 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75ccc7d896-jmrk9"] Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183675 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqnx\" (UniqueName: \"kubernetes.io/projected/399f2693-64b1-4958-ad75-49c45b448ed5-kube-api-access-2dqnx\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183737 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f2693-64b1-4958-ad75-49c45b448ed5-logs\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183824 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-tls-certs\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183850 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-scripts\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183875 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-secret-key\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183896 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-config-data\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.183918 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-combined-ca-bundle\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.232471 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b9cdf76cf-dw2p5"] Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.256262 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d66ddf65b-lmltr"] Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.258227 
4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.283670 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d66ddf65b-lmltr"] Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285052 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f2693-64b1-4958-ad75-49c45b448ed5-logs\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285148 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-tls-certs\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285165 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-scripts\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285186 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-secret-key\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285205 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-config-data\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285222 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-combined-ca-bundle\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285242 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-combined-ca-bundle\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285275 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd978\" (UniqueName: \"kubernetes.io/projected/65337bd1-c674-4817-91c2-ad150639205c-kube-api-access-rd978\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285304 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/65337bd1-c674-4817-91c2-ad150639205c-logs\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285323 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-horizon-tls-certs\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285346 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqnx\" (UniqueName: \"kubernetes.io/projected/399f2693-64b1-4958-ad75-49c45b448ed5-kube-api-access-2dqnx\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285372 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65337bd1-c674-4817-91c2-ad150639205c-config-data\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285606 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-horizon-secret-key\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.285626 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65337bd1-c674-4817-91c2-ad150639205c-scripts\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.286825 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-config-data\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.287399 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-scripts\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.289311 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f2693-64b1-4958-ad75-49c45b448ed5-logs\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.292898 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-secret-key\") pod \"horizon-75ccc7d896-jmrk9\" (UID: 
\"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.313254 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-combined-ca-bundle\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.331728 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-tls-certs\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.345197 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqnx\" (UniqueName: \"kubernetes.io/projected/399f2693-64b1-4958-ad75-49c45b448ed5-kube-api-access-2dqnx\") pod \"horizon-75ccc7d896-jmrk9\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.387872 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-combined-ca-bundle\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.387946 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd978\" (UniqueName: \"kubernetes.io/projected/65337bd1-c674-4817-91c2-ad150639205c-kube-api-access-rd978\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.387981 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65337bd1-c674-4817-91c2-ad150639205c-logs\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.388029 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-horizon-tls-certs\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.388060 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65337bd1-c674-4817-91c2-ad150639205c-config-data\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.388081 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-horizon-secret-key\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: 
I1216 13:06:51.388101 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65337bd1-c674-4817-91c2-ad150639205c-scripts\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.388487 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65337bd1-c674-4817-91c2-ad150639205c-logs\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.389137 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65337bd1-c674-4817-91c2-ad150639205c-scripts\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.391877 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65337bd1-c674-4817-91c2-ad150639205c-config-data\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.395565 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-horizon-secret-key\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.395744 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-horizon-tls-certs\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.397990 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65337bd1-c674-4817-91c2-ad150639205c-combined-ca-bundle\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.408114 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd978\" (UniqueName: \"kubernetes.io/projected/65337bd1-c674-4817-91c2-ad150639205c-kube-api-access-rd978\") pod \"horizon-5d66ddf65b-lmltr\" (UID: \"65337bd1-c674-4817-91c2-ad150639205c\") " pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.466838 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:06:51 crc kubenswrapper[4757]: I1216 13:06:51.580996 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:06:52 crc kubenswrapper[4757]: I1216 13:06:52.112172 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" Dec 16 13:06:52 crc kubenswrapper[4757]: I1216 13:06:52.176547 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-4rpn8"] Dec 16 13:06:52 crc kubenswrapper[4757]: I1216 13:06:52.176817 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" containerID="cri-o://4b0e7dbca92560f00211cde1b539ef23776d5e423469c331c705743192848902" gracePeriod=10 Dec 16 13:06:52 crc kubenswrapper[4757]: I1216 13:06:52.716028 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 16 13:06:52 crc kubenswrapper[4757]: I1216 13:06:52.933525 4757 generic.go:334] "Generic (PLEG): container finished" podID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerID="4b0e7dbca92560f00211cde1b539ef23776d5e423469c331c705743192848902" exitCode=0 Dec 16 13:06:52 crc kubenswrapper[4757]: I1216 13:06:52.933604 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" event={"ID":"8efffcc7-626b-47bf-aa44-c44a74763f9a","Type":"ContainerDied","Data":"4b0e7dbca92560f00211cde1b539ef23776d5e423469c331c705743192848902"} Dec 16 13:06:57 crc kubenswrapper[4757]: E1216 13:06:57.270090 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 16 13:06:57 crc kubenswrapper[4757]: E1216 13:06:57.270732 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25lt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-86cds_openstack(25d57428-c378-4e57-87c5-f1fff2398cec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:06:57 crc kubenswrapper[4757]: E1216 13:06:57.271939 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-86cds" podUID="25d57428-c378-4e57-87c5-f1fff2398cec" Dec 16 13:06:57 crc kubenswrapper[4757]: I1216 13:06:57.715169 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 16 13:06:57 crc kubenswrapper[4757]: I1216 13:06:57.976394 4757 generic.go:334] "Generic (PLEG): container finished" podID="b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" containerID="fef3b9ba90c33271b530e822a9cf915b454fffe64618a682d8d11e0a8e188cd3" exitCode=0 Dec 16 13:06:57 crc kubenswrapper[4757]: I1216 13:06:57.978059 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4tw2" event={"ID":"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8","Type":"ContainerDied","Data":"fef3b9ba90c33271b530e822a9cf915b454fffe64618a682d8d11e0a8e188cd3"} Dec 16 13:06:57 crc kubenswrapper[4757]: E1216 13:06:57.983820 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-86cds" podUID="25d57428-c378-4e57-87c5-f1fff2398cec" Dec 16 13:07:02 crc kubenswrapper[4757]: I1216 13:07:02.714803 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Dec 16 13:07:02 crc kubenswrapper[4757]: I1216 13:07:02.715905 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:07:04 crc kubenswrapper[4757]: E1216 13:07:04.360171 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 16 13:07:04 crc kubenswrapper[4757]: E1216 13:07:04.361256 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbch55fh549h54hf6h58dh58fhcfh98h66hd8h5b5h664h9ch96h5d9h569h4h56dh656h5bbh64chcch684hb9h568h654h548h64ch565hd5h58q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncncm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5b9cdf76cf-dw2p5_openstack(9f6a7cd0-596f-4058-9bde-db8c55aca8c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:04 crc kubenswrapper[4757]: E1216 13:07:04.366287 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5b9cdf76cf-dw2p5" podUID="9f6a7cd0-596f-4058-9bde-db8c55aca8c0" Dec 16 13:07:04 crc kubenswrapper[4757]: E1216 13:07:04.379520 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 16 13:07:04 crc kubenswrapper[4757]: E1216 13:07:04.379695 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57h578hcdh566hbdh594h69h87h5d7h569h568h6fh5fh5dh7dh5b4h9dh55fh5b9h695h544h567h5bdh8fhc6h66fhddh59fh59ch68fh5f7h676q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cfvdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-77668464d5-b2zbm_openstack(73702eed-2df7-4b2a-8bc5-04e038078d7b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:04 crc kubenswrapper[4757]: E1216 13:07:04.383201 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-77668464d5-b2zbm" podUID="73702eed-2df7-4b2a-8bc5-04e038078d7b" Dec 16 13:07:10 crc kubenswrapper[4757]: E1216 13:07:10.354028 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 16 13:07:10 crc kubenswrapper[4757]: E1216 13:07:10.354580 
4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n89h5b9h66dh99h676h58ch5bh6h666h5dbhdh9bh68h58h595h595h6fh6dh648hd6h66bh5bdh5bbh78h68ch695h646hffh56hddh6bh99q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skgsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8d6465865-4lfwj_openstack(aa69e580-99e4-454b-9fa8-906d0410aea5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:10 crc kubenswrapper[4757]: E1216 13:07:10.368389 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8d6465865-4lfwj" podUID="aa69e580-99e4-454b-9fa8-906d0410aea5" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.463142 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.541436 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-fernet-keys\") pod \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.541587 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tm4z\" (UniqueName: \"kubernetes.io/projected/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-kube-api-access-7tm4z\") pod \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.541651 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-config-data\") pod \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.541706 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-scripts\") pod \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.541744 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-combined-ca-bundle\") pod \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.541799 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-credential-keys\") pod \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\" (UID: \"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8\") " Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.552649 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" (UID: "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.557390 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" (UID: "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.559936 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-scripts" (OuterVolumeSpecName: "scripts") pod "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" (UID: "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.563262 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-kube-api-access-7tm4z" (OuterVolumeSpecName: "kube-api-access-7tm4z") pod "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" (UID: "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8"). InnerVolumeSpecName "kube-api-access-7tm4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.575188 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" (UID: "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.602426 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-config-data" (OuterVolumeSpecName: "config-data") pod "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" (UID: "b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.644316 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.644606 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.644682 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.644772 4757 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.644868 4757 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:10 crc kubenswrapper[4757]: I1216 13:07:10.644933 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tm4z\" (UniqueName: \"kubernetes.io/projected/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8-kube-api-access-7tm4z\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.159837 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w4tw2" event={"ID":"b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8","Type":"ContainerDied","Data":"73dde4b86e8bc291a90c8050a25f291ba10ea1f0dedf6ba25ff7df190ce74603"} Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.160136 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73dde4b86e8bc291a90c8050a25f291ba10ea1f0dedf6ba25ff7df190ce74603" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 
13:07:11.159877 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w4tw2" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.554572 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w4tw2"] Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.561177 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w4tw2"] Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.653083 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kg598"] Dec 16 13:07:11 crc kubenswrapper[4757]: E1216 13:07:11.653480 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" containerName="keystone-bootstrap" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.653500 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" containerName="keystone-bootstrap" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.654111 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" containerName="keystone-bootstrap" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.666262 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kg598"] Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.666384 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.672522 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.672598 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.672726 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.672861 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rtqbc" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.672993 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.763947 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-scripts\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.764000 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-fernet-keys\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.764213 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-credential-keys\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 
16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.764236 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-config-data\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.764264 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tkh\" (UniqueName: \"kubernetes.io/projected/082914b4-7f60-4d23-98ec-51f3c8a831aa-kube-api-access-c2tkh\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.764334 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-combined-ca-bundle\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.865365 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-combined-ca-bundle\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.865677 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-scripts\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.865706 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-fernet-keys\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.865775 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-credential-keys\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.865796 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-config-data\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.865824 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2tkh\" (UniqueName: \"kubernetes.io/projected/082914b4-7f60-4d23-98ec-51f3c8a831aa-kube-api-access-c2tkh\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.872136 4757 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-scripts\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.872643 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-config-data\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.873892 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-credential-keys\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.876580 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-fernet-keys\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.877656 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-combined-ca-bundle\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.883128 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2tkh\" (UniqueName: \"kubernetes.io/projected/082914b4-7f60-4d23-98ec-51f3c8a831aa-kube-api-access-c2tkh\") pod \"keystone-bootstrap-kg598\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") " pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:11 crc kubenswrapper[4757]: I1216 13:07:11.997088 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kg598" Dec 16 13:07:12 crc kubenswrapper[4757]: I1216 13:07:12.715556 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 16 13:07:12 crc kubenswrapper[4757]: I1216 13:07:12.959107 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8" path="/var/lib/kubelet/pods/b93a5bc5-13ed-4a22-baa3-8f3ca5b3b7e8/volumes" Dec 16 13:07:17 crc kubenswrapper[4757]: I1216 13:07:17.716477 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.181963 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.182358 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.212812 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.270685 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.310075 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9cdf76cf-dw2p5" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.325806 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9cdf76cf-dw2p5" event={"ID":"9f6a7cd0-596f-4058-9bde-db8c55aca8c0","Type":"ContainerDied","Data":"912054bc8f5e30deea7c554b44f2eab745153f47665f9f043cf8f66f4b340334"} Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.325880 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9cdf76cf-dw2p5" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.329493 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77668464d5-b2zbm" event={"ID":"73702eed-2df7-4b2a-8bc5-04e038078d7b","Type":"ContainerDied","Data":"adff2b2e06e0d4c8f9ade06fbbb9cfb6ff1ec52d7dd3d537bf139866c10c8500"} Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.329675 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77668464d5-b2zbm" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.340017 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" event={"ID":"8efffcc7-626b-47bf-aa44-c44a74763f9a","Type":"ContainerDied","Data":"bf4a601cf4c61b45c9ef44c1dc5c22762ff4cfec80f282e3db80ad0e629e63c9"} Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.340541 4757 scope.go:117] "RemoveContainer" containerID="4b0e7dbca92560f00211cde1b539ef23776d5e423469c331c705743192848902" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.340139 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.372316 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-nb\") pod \"8efffcc7-626b-47bf-aa44-c44a74763f9a\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.372387 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-sb\") pod \"8efffcc7-626b-47bf-aa44-c44a74763f9a\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.372499 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-config-data\") pod \"73702eed-2df7-4b2a-8bc5-04e038078d7b\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.372525 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpnpb\" (UniqueName: \"kubernetes.io/projected/8efffcc7-626b-47bf-aa44-c44a74763f9a-kube-api-access-gpnpb\") pod \"8efffcc7-626b-47bf-aa44-c44a74763f9a\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.372543 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-swift-storage-0\") pod \"8efffcc7-626b-47bf-aa44-c44a74763f9a\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373057 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-config\") pod \"8efffcc7-626b-47bf-aa44-c44a74763f9a\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373125 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-svc\") pod \"8efffcc7-626b-47bf-aa44-c44a74763f9a\" (UID: \"8efffcc7-626b-47bf-aa44-c44a74763f9a\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373151 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfvdn\" (UniqueName: \"kubernetes.io/projected/73702eed-2df7-4b2a-8bc5-04e038078d7b-kube-api-access-cfvdn\") pod \"73702eed-2df7-4b2a-8bc5-04e038078d7b\" (UID: 
\"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373193 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73702eed-2df7-4b2a-8bc5-04e038078d7b-logs\") pod \"73702eed-2df7-4b2a-8bc5-04e038078d7b\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373214 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-scripts\") pod \"73702eed-2df7-4b2a-8bc5-04e038078d7b\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373235 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-scripts\") pod \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373269 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73702eed-2df7-4b2a-8bc5-04e038078d7b-horizon-secret-key\") pod \"73702eed-2df7-4b2a-8bc5-04e038078d7b\" (UID: \"73702eed-2df7-4b2a-8bc5-04e038078d7b\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.373776 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-config-data" (OuterVolumeSpecName: "config-data") pod "73702eed-2df7-4b2a-8bc5-04e038078d7b" (UID: "73702eed-2df7-4b2a-8bc5-04e038078d7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.375375 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.376300 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73702eed-2df7-4b2a-8bc5-04e038078d7b-logs" (OuterVolumeSpecName: "logs") pod "73702eed-2df7-4b2a-8bc5-04e038078d7b" (UID: "73702eed-2df7-4b2a-8bc5-04e038078d7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.402246 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efffcc7-626b-47bf-aa44-c44a74763f9a-kube-api-access-gpnpb" (OuterVolumeSpecName: "kube-api-access-gpnpb") pod "8efffcc7-626b-47bf-aa44-c44a74763f9a" (UID: "8efffcc7-626b-47bf-aa44-c44a74763f9a"). InnerVolumeSpecName "kube-api-access-gpnpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.476512 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncncm\" (UniqueName: \"kubernetes.io/projected/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-kube-api-access-ncncm\") pod \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.476815 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-horizon-secret-key\") pod \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.476908 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-config-data\") pod \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.477451 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-logs\") pod \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\" (UID: \"9f6a7cd0-596f-4058-9bde-db8c55aca8c0\") " Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.477756 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-logs" (OuterVolumeSpecName: "logs") pod "9f6a7cd0-596f-4058-9bde-db8c55aca8c0" (UID: "9f6a7cd0-596f-4058-9bde-db8c55aca8c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.478471 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.478592 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpnpb\" (UniqueName: \"kubernetes.io/projected/8efffcc7-626b-47bf-aa44-c44a74763f9a-kube-api-access-gpnpb\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:21 crc kubenswrapper[4757]: I1216 13:07:21.478723 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73702eed-2df7-4b2a-8bc5-04e038078d7b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.258673 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8efffcc7-626b-47bf-aa44-c44a74763f9a" (UID: "8efffcc7-626b-47bf-aa44-c44a74763f9a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.259270 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8efffcc7-626b-47bf-aa44-c44a74763f9a" (UID: "8efffcc7-626b-47bf-aa44-c44a74763f9a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.259372 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8efffcc7-626b-47bf-aa44-c44a74763f9a" (UID: "8efffcc7-626b-47bf-aa44-c44a74763f9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.264328 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73702eed-2df7-4b2a-8bc5-04e038078d7b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "73702eed-2df7-4b2a-8bc5-04e038078d7b" (UID: "73702eed-2df7-4b2a-8bc5-04e038078d7b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.264672 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-scripts" (OuterVolumeSpecName: "scripts") pod "73702eed-2df7-4b2a-8bc5-04e038078d7b" (UID: "73702eed-2df7-4b2a-8bc5-04e038078d7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.265343 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-config-data" (OuterVolumeSpecName: "config-data") pod "9f6a7cd0-596f-4058-9bde-db8c55aca8c0" (UID: "9f6a7cd0-596f-4058-9bde-db8c55aca8c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.265384 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8efffcc7-626b-47bf-aa44-c44a74763f9a" (UID: "8efffcc7-626b-47bf-aa44-c44a74763f9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.265644 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-scripts" (OuterVolumeSpecName: "scripts") pod "9f6a7cd0-596f-4058-9bde-db8c55aca8c0" (UID: "9f6a7cd0-596f-4058-9bde-db8c55aca8c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.265994 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-kube-api-access-ncncm" (OuterVolumeSpecName: "kube-api-access-ncncm") pod "9f6a7cd0-596f-4058-9bde-db8c55aca8c0" (UID: "9f6a7cd0-596f-4058-9bde-db8c55aca8c0"). InnerVolumeSpecName "kube-api-access-ncncm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.266091 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73702eed-2df7-4b2a-8bc5-04e038078d7b-kube-api-access-cfvdn" (OuterVolumeSpecName: "kube-api-access-cfvdn") pod "73702eed-2df7-4b2a-8bc5-04e038078d7b" (UID: "73702eed-2df7-4b2a-8bc5-04e038078d7b"). InnerVolumeSpecName "kube-api-access-cfvdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.266790 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9f6a7cd0-596f-4058-9bde-db8c55aca8c0" (UID: "9f6a7cd0-596f-4058-9bde-db8c55aca8c0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.272483 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-config" (OuterVolumeSpecName: "config") pod "8efffcc7-626b-47bf-aa44-c44a74763f9a" (UID: "8efffcc7-626b-47bf-aa44-c44a74763f9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300200 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300240 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300256 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncncm\" (UniqueName: \"kubernetes.io/projected/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-kube-api-access-ncncm\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300271 4757 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300283 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300295 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300306 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfvdn\" (UniqueName: \"kubernetes.io/projected/73702eed-2df7-4b2a-8bc5-04e038078d7b-kube-api-access-cfvdn\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300318 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/73702eed-2df7-4b2a-8bc5-04e038078d7b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300328 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f6a7cd0-596f-4058-9bde-db8c55aca8c0-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300339 4757 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/73702eed-2df7-4b2a-8bc5-04e038078d7b-horizon-secret-key\") 
on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300353 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.300366 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8efffcc7-626b-47bf-aa44-c44a74763f9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.631069 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b9cdf76cf-dw2p5"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.640983 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b9cdf76cf-dw2p5"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.654523 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75ccc7d896-jmrk9"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.673609 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77668464d5-b2zbm"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.682254 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77668464d5-b2zbm"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.689998 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-4rpn8"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.699592 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-4rpn8"] Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.718041 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-4rpn8" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.959381 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73702eed-2df7-4b2a-8bc5-04e038078d7b" path="/var/lib/kubelet/pods/73702eed-2df7-4b2a-8bc5-04e038078d7b/volumes" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.959944 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" path="/var/lib/kubelet/pods/8efffcc7-626b-47bf-aa44-c44a74763f9a/volumes" Dec 16 13:07:22 crc kubenswrapper[4757]: I1216 13:07:22.960626 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6a7cd0-596f-4058-9bde-db8c55aca8c0" path="/var/lib/kubelet/pods/9f6a7cd0-596f-4058-9bde-db8c55aca8c0/volumes" Dec 16 13:07:22 crc kubenswrapper[4757]: E1216 13:07:22.998273 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 16 13:07:22 crc kubenswrapper[4757]: E1216 13:07:22.998650 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5w9r5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tjbz5_openstack(362eaecb-4139-44f9-a651-3e14cc2d6ae2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:22 crc kubenswrapper[4757]: E1216 13:07:22.999964 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tjbz5" podUID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" Dec 16 13:07:23 crc kubenswrapper[4757]: E1216 13:07:23.306855 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 16 13:07:23 crc kubenswrapper[4757]: E1216 13:07:23.307051 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n75h668hdbh649h59fh5ch66chc8h57fh7dh544h697hfhc6h685h7ch5cfh76h59bh67dh544h55h687h69hf5hd8h54ch66fh5f6hc4h65bhbcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs2rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(32bc2074-3d53-44b4-8e6c-c500a2617944): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:23 crc kubenswrapper[4757]: E1216 13:07:23.356794 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-tjbz5" podUID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.402222 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.426946 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-config-data\") pod \"aa69e580-99e4-454b-9fa8-906d0410aea5\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.427339 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-scripts\") pod \"aa69e580-99e4-454b-9fa8-906d0410aea5\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.427420 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgsj\" (UniqueName: \"kubernetes.io/projected/aa69e580-99e4-454b-9fa8-906d0410aea5-kube-api-access-skgsj\") pod \"aa69e580-99e4-454b-9fa8-906d0410aea5\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.427467 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa69e580-99e4-454b-9fa8-906d0410aea5-horizon-secret-key\") pod \"aa69e580-99e4-454b-9fa8-906d0410aea5\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.427625 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa69e580-99e4-454b-9fa8-906d0410aea5-logs\") pod \"aa69e580-99e4-454b-9fa8-906d0410aea5\" (UID: \"aa69e580-99e4-454b-9fa8-906d0410aea5\") " Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.427879 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-scripts" (OuterVolumeSpecName: "scripts") pod "aa69e580-99e4-454b-9fa8-906d0410aea5" (UID: "aa69e580-99e4-454b-9fa8-906d0410aea5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.428230 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-config-data" (OuterVolumeSpecName: "config-data") pod "aa69e580-99e4-454b-9fa8-906d0410aea5" (UID: "aa69e580-99e4-454b-9fa8-906d0410aea5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.428243 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa69e580-99e4-454b-9fa8-906d0410aea5-logs" (OuterVolumeSpecName: "logs") pod "aa69e580-99e4-454b-9fa8-906d0410aea5" (UID: "aa69e580-99e4-454b-9fa8-906d0410aea5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.428502 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa69e580-99e4-454b-9fa8-906d0410aea5-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.428523 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.428553 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa69e580-99e4-454b-9fa8-906d0410aea5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.434514 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa69e580-99e4-454b-9fa8-906d0410aea5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aa69e580-99e4-454b-9fa8-906d0410aea5" (UID: "aa69e580-99e4-454b-9fa8-906d0410aea5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.437248 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa69e580-99e4-454b-9fa8-906d0410aea5-kube-api-access-skgsj" (OuterVolumeSpecName: "kube-api-access-skgsj") pod "aa69e580-99e4-454b-9fa8-906d0410aea5" (UID: "aa69e580-99e4-454b-9fa8-906d0410aea5"). InnerVolumeSpecName "kube-api-access-skgsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.529842 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgsj\" (UniqueName: \"kubernetes.io/projected/aa69e580-99e4-454b-9fa8-906d0410aea5-kube-api-access-skgsj\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:23 crc kubenswrapper[4757]: I1216 13:07:23.530063 4757 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa69e580-99e4-454b-9fa8-906d0410aea5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:24 crc kubenswrapper[4757]: I1216 13:07:24.363162 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8d6465865-4lfwj" Dec 16 13:07:24 crc kubenswrapper[4757]: I1216 13:07:24.363146 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8d6465865-4lfwj" event={"ID":"aa69e580-99e4-454b-9fa8-906d0410aea5","Type":"ContainerDied","Data":"05078933b163c5ea0454940374b93dad47813396739081db34021c3a30ad5a9f"} Dec 16 13:07:24 crc kubenswrapper[4757]: I1216 13:07:24.430190 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8d6465865-4lfwj"] Dec 16 13:07:24 crc kubenswrapper[4757]: I1216 13:07:24.442213 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8d6465865-4lfwj"] Dec 16 13:07:24 crc kubenswrapper[4757]: I1216 13:07:24.960072 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa69e580-99e4-454b-9fa8-906d0410aea5" path="/var/lib/kubelet/pods/aa69e580-99e4-454b-9fa8-906d0410aea5/volumes" Dec 16 13:07:26 crc kubenswrapper[4757]: I1216 13:07:26.455251 4757 scope.go:117] "RemoveContainer" containerID="37e2c93447e8030756b33da98cea9caebf652010794bc2f62bec0274660d9053" Dec 16 13:07:26 crc kubenswrapper[4757]: I1216 13:07:26.543893 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerStarted","Data":"9a841203e84fb18d5fece40a98bea67abcafaf0ddc9e93031e6806f863f1c521"} Dec 16 13:07:26 crc kubenswrapper[4757]: I1216 13:07:26.972986 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kg598"] Dec 16 13:07:26 crc kubenswrapper[4757]: W1216 13:07:26.980197 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod082914b4_7f60_4d23_98ec_51f3c8a831aa.slice/crio-5b9feb52f7041becd238d54ea90fd4666e7207e9d14642dbd27bb45bd464620f WatchSource:0}: Error finding container 5b9feb52f7041becd238d54ea90fd4666e7207e9d14642dbd27bb45bd464620f: Status 404 returned error can't find the container with id 5b9feb52f7041becd238d54ea90fd4666e7207e9d14642dbd27bb45bd464620f Dec 16 13:07:27 crc kubenswrapper[4757]: I1216 13:07:27.133519 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d66ddf65b-lmltr"] Dec 16 13:07:27 crc kubenswrapper[4757]: W1216 13:07:27.141169 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65337bd1_c674_4817_91c2_ad150639205c.slice/crio-2e91315cae6548fc98278123a5eaa97a358c083f59e977b01f1de73104ec2080 WatchSource:0}: Error finding container 2e91315cae6548fc98278123a5eaa97a358c083f59e977b01f1de73104ec2080: Status 404 returned error can't find the container with id 2e91315cae6548fc98278123a5eaa97a358c083f59e977b01f1de73104ec2080 Dec 16 13:07:27 crc kubenswrapper[4757]: I1216 13:07:27.561244 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerStarted","Data":"2e91315cae6548fc98278123a5eaa97a358c083f59e977b01f1de73104ec2080"} Dec 16 13:07:27 crc kubenswrapper[4757]: I1216 13:07:27.562344 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kg598" event={"ID":"082914b4-7f60-4d23-98ec-51f3c8a831aa","Type":"ContainerStarted","Data":"5b9feb52f7041becd238d54ea90fd4666e7207e9d14642dbd27bb45bd464620f"} Dec 16 13:07:29 crc kubenswrapper[4757]: I1216 13:07:29.581927 4757 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1cc3bc2-c644-4732-a745-2c63515caf83","Type":"ContainerStarted","Data":"f2acd470c87b7435dcbc86ea09ffd0537f79e20a293832e3b9098f1b7c35a334"} Dec 16 13:07:29 crc kubenswrapper[4757]: I1216 13:07:29.586521 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c98dac12-1368-4c66-9456-0e364ca153d7","Type":"ContainerStarted","Data":"d2e8f63fa3e71ccf4b9e0520b1f72ef5e4e1b33def78c2b7b87705ba63896bf8"} Dec 16 13:07:30 crc kubenswrapper[4757]: I1216 13:07:30.597525 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kg598" event={"ID":"082914b4-7f60-4d23-98ec-51f3c8a831aa","Type":"ContainerStarted","Data":"ad9d80b9af09d564a6151170db5443c56272b5afdd1aefa55ddbbd775f41a0aa"} Dec 16 13:07:30 crc kubenswrapper[4757]: I1216 13:07:30.597655 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-httpd" containerID="cri-o://d2e8f63fa3e71ccf4b9e0520b1f72ef5e4e1b33def78c2b7b87705ba63896bf8" gracePeriod=30 Dec 16 13:07:30 crc kubenswrapper[4757]: I1216 13:07:30.597863 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-log" containerID="cri-o://e2f940cded57ea9914d909a7a61ce4ba1e6e652a540bee524f68152d3e7c3db2" gracePeriod=30 Dec 16 13:07:30 crc kubenswrapper[4757]: I1216 13:07:30.633248 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=48.63322395 podStartE2EDuration="48.63322395s" podCreationTimestamp="2025-12-16 13:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:07:30.627726717 +0000 UTC m=+1236.055470533" watchObservedRunningTime="2025-12-16 13:07:30.63322395 +0000 UTC m=+1236.060967746" Dec 16 13:07:31 crc kubenswrapper[4757]: I1216 13:07:31.607606 4757 generic.go:334] "Generic (PLEG): container finished" podID="c98dac12-1368-4c66-9456-0e364ca153d7" containerID="e2f940cded57ea9914d909a7a61ce4ba1e6e652a540bee524f68152d3e7c3db2" exitCode=143 Dec 16 13:07:31 crc kubenswrapper[4757]: I1216 13:07:31.607735 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c98dac12-1368-4c66-9456-0e364ca153d7","Type":"ContainerDied","Data":"e2f940cded57ea9914d909a7a61ce4ba1e6e652a540bee524f68152d3e7c3db2"} Dec 16 13:07:31 crc kubenswrapper[4757]: I1216 13:07:31.608956 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-log" containerID="cri-o://b5ad814ce268cbd19cec50bb8eba771876e5abb2a23e2154408ff781ef72f60f" gracePeriod=30 Dec 16 13:07:31 crc kubenswrapper[4757]: I1216 13:07:31.609110 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-httpd" containerID="cri-o://f2acd470c87b7435dcbc86ea09ffd0537f79e20a293832e3b9098f1b7c35a334" gracePeriod=30 Dec 16 13:07:31 crc kubenswrapper[4757]: I1216 13:07:31.636549 4757 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=50.63651049 podStartE2EDuration="50.63651049s" podCreationTimestamp="2025-12-16 13:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:07:31.625315881 +0000 UTC m=+1237.053059687" watchObservedRunningTime="2025-12-16 13:07:31.63651049 +0000 UTC m=+1237.064254286" Dec 16 13:07:31 crc kubenswrapper[4757]: E1216 13:07:31.973333 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 16 13:07:31 crc kubenswrapper[4757]: E1216 13:07:31.973529 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bpnll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7lhwd_openstack(e48bb858-bd6e-4dbc-a17a-5fd5e1275e00): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:31 crc kubenswrapper[4757]: E1216 13:07:31.974872 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7lhwd" podUID="e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.618886 4757 generic.go:334] "Generic (PLEG): container finished" podID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerID="f2acd470c87b7435dcbc86ea09ffd0537f79e20a293832e3b9098f1b7c35a334" exitCode=0 Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.619211 4757 generic.go:334] "Generic (PLEG): container finished" podID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerID="b5ad814ce268cbd19cec50bb8eba771876e5abb2a23e2154408ff781ef72f60f" exitCode=143 Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.618949 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1cc3bc2-c644-4732-a745-2c63515caf83","Type":"ContainerDied","Data":"f2acd470c87b7435dcbc86ea09ffd0537f79e20a293832e3b9098f1b7c35a334"} Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.619287 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1cc3bc2-c644-4732-a745-2c63515caf83","Type":"ContainerDied","Data":"b5ad814ce268cbd19cec50bb8eba771876e5abb2a23e2154408ff781ef72f60f"} Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.621489 4757 generic.go:334] "Generic (PLEG): container finished" podID="c98dac12-1368-4c66-9456-0e364ca153d7" containerID="d2e8f63fa3e71ccf4b9e0520b1f72ef5e4e1b33def78c2b7b87705ba63896bf8" exitCode=0 Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.621511 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c98dac12-1368-4c66-9456-0e364ca153d7","Type":"ContainerDied","Data":"d2e8f63fa3e71ccf4b9e0520b1f72ef5e4e1b33def78c2b7b87705ba63896bf8"} Dec 16 13:07:32 crc kubenswrapper[4757]: I1216 13:07:32.662754 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kg598" podStartSLOduration=21.662733361 podStartE2EDuration="21.662733361s" podCreationTimestamp="2025-12-16 13:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:07:32.657407462 +0000 UTC m=+1238.085151258" watchObservedRunningTime="2025-12-16 13:07:32.662733361 +0000 UTC m=+1238.090477157" Dec 16 13:07:42 crc kubenswrapper[4757]: I1216 13:07:42.926214 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 13:07:42 crc kubenswrapper[4757]: I1216 13:07:42.926667 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 13:07:43 crc kubenswrapper[4757]: I1216 13:07:43.509101 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 13:07:43 crc kubenswrapper[4757]: I1216 13:07:43.509478 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.435658 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.441778 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.584195 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-httpd-run\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.584572 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.585785 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-logs\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.584802 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586123 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-combined-ca-bundle\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586183 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-logs" (OuterVolumeSpecName: "logs") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586427 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-httpd-run\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586609 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-scripts\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586714 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mldc\" (UniqueName: \"kubernetes.io/projected/a1cc3bc2-c644-4732-a745-2c63515caf83-kube-api-access-7mldc\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586816 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-logs\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586936 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-config-data\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.586843 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.587173 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-logs" (OuterVolumeSpecName: "logs") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.587993 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-config-data\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.588428 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-scripts\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.588555 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-combined-ca-bundle\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.588662 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ncnx\" (UniqueName: \"kubernetes.io/projected/c98dac12-1368-4c66-9456-0e364ca153d7-kube-api-access-8ncnx\") pod \"c98dac12-1368-4c66-9456-0e364ca153d7\" (UID: \"c98dac12-1368-4c66-9456-0e364ca153d7\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.588762 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a1cc3bc2-c644-4732-a745-2c63515caf83\" (UID: \"a1cc3bc2-c644-4732-a745-2c63515caf83\") " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.589427 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.590131 4757 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.591464 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cc3bc2-c644-4732-a745-2c63515caf83-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.591677 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1cc3bc2-c644-4732-a745-2c63515caf83-kube-api-access-7mldc" (OuterVolumeSpecName: "kube-api-access-7mldc") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "kube-api-access-7mldc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.591705 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.591580 4757 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c98dac12-1368-4c66-9456-0e364ca153d7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.594035 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-scripts" (OuterVolumeSpecName: "scripts") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.601484 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.602411 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98dac12-1368-4c66-9456-0e364ca153d7-kube-api-access-8ncnx" (OuterVolumeSpecName: "kube-api-access-8ncnx") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "kube-api-access-8ncnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.606266 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-scripts" (OuterVolumeSpecName: "scripts") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.648596 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-config-data" (OuterVolumeSpecName: "config-data") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.649067 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.666056 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1cc3bc2-c644-4732-a745-2c63515caf83" (UID: "a1cc3bc2-c644-4732-a745-2c63515caf83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.666745 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-config-data" (OuterVolumeSpecName: "config-data") pod "c98dac12-1368-4c66-9456-0e364ca153d7" (UID: "c98dac12-1368-4c66-9456-0e364ca153d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.695615 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.696201 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.696355 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.696498 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mldc\" (UniqueName: \"kubernetes.io/projected/a1cc3bc2-c644-4732-a745-2c63515caf83-kube-api-access-7mldc\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.696611 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.697040 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98dac12-1368-4c66-9456-0e364ca153d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.697128 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.697214 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cc3bc2-c644-4732-a745-2c63515caf83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.697325 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ncnx\" (UniqueName: \"kubernetes.io/projected/c98dac12-1368-4c66-9456-0e364ca153d7-kube-api-access-8ncnx\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.697429 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.716903 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.718349 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.726196 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a1cc3bc2-c644-4732-a745-2c63515caf83","Type":"ContainerDied","Data":"7bae95c1593a3ccb3a00af7875561b3f66bc37eee0f79e026d090848958b0ac7"} Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.726247 4757 scope.go:117] "RemoveContainer" containerID="f2acd470c87b7435dcbc86ea09ffd0537f79e20a293832e3b9098f1b7c35a334" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.726357 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.731261 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c98dac12-1368-4c66-9456-0e364ca153d7","Type":"ContainerDied","Data":"f852324ad0486f9cd6d82e47c132aab181a0e53a5dc6b046a01b9595adc60faf"} Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.731342 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.800948 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.800984 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.804078 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.814014 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.821523 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.821703 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n75h668hdbh649h59fh5ch66chc8h57fh7dh544h697hfhc6h685h7ch5cfh76h59bh67dh544h55h687h69hf5hd8h54ch66fh5f6hc4h65bhbcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs2rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(32bc2074-3d53-44b4-8e6c-c500a2617944): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.823467 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.835251 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847244 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.847819 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-log" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847840 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-log" Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.847853 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-log" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847862 4757 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-log" Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.847889 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847896 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.847914 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-httpd" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847921 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-httpd" Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.847943 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="init" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847950 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="init" Dec 16 13:07:45 crc kubenswrapper[4757]: E1216 13:07:45.847963 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-httpd" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.847970 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-httpd" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.848195 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-httpd" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.848216 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" containerName="glance-log" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.848274 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-log" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.848329 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efffcc7-626b-47bf-aa44-c44a74763f9a" containerName="dnsmasq-dns" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.848341 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" containerName="glance-httpd" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.852372 4757 scope.go:117] "RemoveContainer" containerID="b5ad814ce268cbd19cec50bb8eba771876e5abb2a23e2154408ff781ef72f60f" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.855451 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.855893 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.863067 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.863338 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.870781 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.871330 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.871464 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.871584 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.871597 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.871701 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24n8w" Dec 16 13:07:45 crc kubenswrapper[4757]: I1216 13:07:45.943076 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.005674 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006239 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006332 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flk28\" (UniqueName: \"kubernetes.io/projected/9f22fc73-c034-4c9d-8274-215e0ef2a208-kube-api-access-flk28\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0" Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006398 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0" Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006467 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dkpg\" (UniqueName: \"kubernetes.io/projected/da952307-39db-4816-8465-d931bd94436d-kube-api-access-4dkpg\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0" Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006552 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006635 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-logs\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006721 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006856 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.006953 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.007058 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.007247 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.007334 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.007470 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.007528 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-logs\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.007571 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.012979 4757 scope.go:117] "RemoveContainer" containerID="d2e8f63fa3e71ccf4b9e0520b1f72ef5e4e1b33def78c2b7b87705ba63896bf8"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108819 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-logs\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108867 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108898 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108919 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108962 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flk28\" (UniqueName: \"kubernetes.io/projected/9f22fc73-c034-4c9d-8274-215e0ef2a208-kube-api-access-flk28\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108976 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.108993 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dkpg\" (UniqueName: \"kubernetes.io/projected/da952307-39db-4816-8465-d931bd94436d-kube-api-access-4dkpg\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109028 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109053 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-logs\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109069 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109099 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109116 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109143 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109159 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109176 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109229 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109400 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-logs\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109423 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109532 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.109793 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-logs\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.110863 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.111287 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.115804 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.116264 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-scripts\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.121259 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.121526 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.123858 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.124501 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.125879 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.126262 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flk28\" (UniqueName: \"kubernetes.io/projected/9f22fc73-c034-4c9d-8274-215e0ef2a208-kube-api-access-flk28\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.128103 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-config-data\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.129425 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dkpg\" (UniqueName: \"kubernetes.io/projected/da952307-39db-4816-8465-d931bd94436d-kube-api-access-4dkpg\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.139845 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.160225 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.201881 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.236778 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.562731 4757 scope.go:117] "RemoveContainer" containerID="e2f940cded57ea9914d909a7a61ce4ba1e6e652a540bee524f68152d3e7c3db2"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.763229 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerStarted","Data":"68083e5e4fceed20d93854ee8f903c264c2a482888dea70804815fb85cf9ddb7"}
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.982123 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1cc3bc2-c644-4732-a745-2c63515caf83" path="/var/lib/kubelet/pods/a1cc3bc2-c644-4732-a745-2c63515caf83/volumes"
Dec 16 13:07:46 crc kubenswrapper[4757]: I1216 13:07:46.983261 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98dac12-1368-4c66-9456-0e364ca153d7" path="/var/lib/kubelet/pods/c98dac12-1368-4c66-9456-0e364ca153d7/volumes"
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.232734 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.330954 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 13:07:47 crc kubenswrapper[4757]: W1216 13:07:47.550131 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f22fc73_c034_4c9d_8274_215e0ef2a208.slice/crio-2c7545ccd06e40a0e3703e23eb33f7a3c40e2215e88f6a67d1cacfcb92b07af9 WatchSource:0}: Error finding container 2c7545ccd06e40a0e3703e23eb33f7a3c40e2215e88f6a67d1cacfcb92b07af9: Status 404 returned error can't find the container with id 2c7545ccd06e40a0e3703e23eb33f7a3c40e2215e88f6a67d1cacfcb92b07af9
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.799029 4757 generic.go:334] "Generic (PLEG): container finished" podID="082914b4-7f60-4d23-98ec-51f3c8a831aa" containerID="ad9d80b9af09d564a6151170db5443c56272b5afdd1aefa55ddbbd775f41a0aa" exitCode=0
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.799083 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kg598" event={"ID":"082914b4-7f60-4d23-98ec-51f3c8a831aa","Type":"ContainerDied","Data":"ad9d80b9af09d564a6151170db5443c56272b5afdd1aefa55ddbbd775f41a0aa"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.804597 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lhwd" event={"ID":"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00","Type":"ContainerStarted","Data":"0e351d201b95334e733de4822ee4dfa2d43cbb85bf172b683744b901cb0cd0e8"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.808288 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerStarted","Data":"27930a46a589b4d3638de24e488861d0ce2d79305f0f2fcc652982d005f8b8df"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.812868 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerStarted","Data":"357eee533356136a47d50ddfe4b10cb06996ef66793b34da2123f1bd22018055"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.816928 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86cds" event={"ID":"25d57428-c378-4e57-87c5-f1fff2398cec","Type":"ContainerStarted","Data":"f20ef5e006c72abb7a0a7a3f3c95380816bf64afd8ac9ee819347f5406e6e010"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.845274 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f22fc73-c034-4c9d-8274-215e0ef2a208","Type":"ContainerStarted","Data":"2c7545ccd06e40a0e3703e23eb33f7a3c40e2215e88f6a67d1cacfcb92b07af9"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.849583 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-86cds" podStartSLOduration=3.3743218759999998 podStartE2EDuration="1m9.849564061s" podCreationTimestamp="2025-12-16 13:06:38 +0000 UTC" firstStartedPulling="2025-12-16 13:06:39.393592499 +0000 UTC m=+1184.821336295" lastFinishedPulling="2025-12-16 13:07:45.868834684 +0000 UTC m=+1251.296578480" observedRunningTime="2025-12-16 13:07:47.84545809 +0000 UTC m=+1253.273201886" watchObservedRunningTime="2025-12-16 13:07:47.849564061 +0000 UTC m=+1253.277307857"
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.850586 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"da952307-39db-4816-8465-d931bd94436d","Type":"ContainerStarted","Data":"bce010abd62e016254d72186f5a6337abfac7af02831fadc656261a40cb29f0f"}
Dec 16 13:07:47 crc kubenswrapper[4757]: I1216 13:07:47.872833 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d66ddf65b-lmltr" podStartSLOduration=38.16462138 podStartE2EDuration="56.872810537s" podCreationTimestamp="2025-12-16 13:06:51 +0000 UTC" firstStartedPulling="2025-12-16 13:07:27.14371991 +0000 UTC m=+1232.571463706" lastFinishedPulling="2025-12-16 13:07:45.851908857 +0000 UTC m=+1251.279652863" observedRunningTime="2025-12-16 13:07:47.872784977 +0000 UTC m=+1253.300528773" watchObservedRunningTime="2025-12-16 13:07:47.872810537 +0000 UTC m=+1253.300554333"
Dec 16 13:07:48 crc kubenswrapper[4757]: I1216 13:07:48.866199 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerStarted","Data":"25ab2be0f535088a98cd5974a570660c2b1ab7f032761874bdf1659a40210f03"}
Dec 16 13:07:48 crc kubenswrapper[4757]: I1216 13:07:48.870032 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"da952307-39db-4816-8465-d931bd94436d","Type":"ContainerStarted","Data":"b2a06ac9d2ae490290b06680078382d77bf03896a450c8d9219e69834fa8ffaf"}
Dec 16 13:07:48 crc kubenswrapper[4757]: I1216 13:07:48.872163 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tjbz5" event={"ID":"362eaecb-4139-44f9-a651-3e14cc2d6ae2","Type":"ContainerStarted","Data":"5e198d54c33d19283c1b227ff4c28be1278e25af8630e271c1f46e46d1980127"}
Dec 16 13:07:48 crc kubenswrapper[4757]: I1216 13:07:48.915935 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75ccc7d896-jmrk9" podStartSLOduration=38.475012692 podStartE2EDuration="57.915909743s" podCreationTimestamp="2025-12-16 13:06:51 +0000 UTC" firstStartedPulling="2025-12-16 13:07:26.321547027 +0000 UTC m=+1231.749290823" lastFinishedPulling="2025-12-16 13:07:45.762444078 +0000 UTC m=+1251.190187874" observedRunningTime="2025-12-16 13:07:48.895711575 +0000 UTC m=+1254.323455371" watchObservedRunningTime="2025-12-16 13:07:48.915909743 +0000 UTC m=+1254.343653539"
Dec 16 13:07:48 crc kubenswrapper[4757]: I1216 13:07:48.931988 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7lhwd" podStartSLOduration=4.686274852 podStartE2EDuration="1m11.931970071s" podCreationTimestamp="2025-12-16 13:06:37 +0000 UTC" firstStartedPulling="2025-12-16 13:06:39.353838114 +0000 UTC m=+1184.781581900" lastFinishedPulling="2025-12-16 13:07:46.599533323 +0000 UTC m=+1252.027277119" observedRunningTime="2025-12-16 13:07:48.913148493 +0000 UTC m=+1254.340892289" watchObservedRunningTime="2025-12-16 13:07:48.931970071 +0000 UTC m=+1254.359713867"
Dec 16 13:07:48 crc kubenswrapper[4757]: I1216 13:07:48.947933 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tjbz5" podStartSLOduration=3.12360581 podStartE2EDuration="1m10.947912845s" podCreationTimestamp="2025-12-16 13:06:38 +0000 UTC" firstStartedPulling="2025-12-16 13:06:39.876228441 +0000 UTC m=+1185.303972237" lastFinishedPulling="2025-12-16 13:07:47.700535476 +0000 UTC m=+1253.128279272" observedRunningTime="2025-12-16 13:07:48.944905009 +0000 UTC m=+1254.372648825" watchObservedRunningTime="2025-12-16 13:07:48.947912845 +0000 UTC m=+1254.375656641"
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.329542 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kg598"
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.516903 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-config-data\") pod \"082914b4-7f60-4d23-98ec-51f3c8a831aa\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") "
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.517379 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-fernet-keys\") pod \"082914b4-7f60-4d23-98ec-51f3c8a831aa\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") "
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.517883 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-scripts\") pod \"082914b4-7f60-4d23-98ec-51f3c8a831aa\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") "
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.517930 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-credential-keys\") pod \"082914b4-7f60-4d23-98ec-51f3c8a831aa\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") "
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.517957 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-combined-ca-bundle\") pod \"082914b4-7f60-4d23-98ec-51f3c8a831aa\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") "
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.517991 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2tkh\" (UniqueName: \"kubernetes.io/projected/082914b4-7f60-4d23-98ec-51f3c8a831aa-kube-api-access-c2tkh\") pod \"082914b4-7f60-4d23-98ec-51f3c8a831aa\" (UID: \"082914b4-7f60-4d23-98ec-51f3c8a831aa\") "
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.523592 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "082914b4-7f60-4d23-98ec-51f3c8a831aa" (UID: "082914b4-7f60-4d23-98ec-51f3c8a831aa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.524240 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "082914b4-7f60-4d23-98ec-51f3c8a831aa" (UID: "082914b4-7f60-4d23-98ec-51f3c8a831aa"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.525747 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082914b4-7f60-4d23-98ec-51f3c8a831aa-kube-api-access-c2tkh" (OuterVolumeSpecName: "kube-api-access-c2tkh") pod "082914b4-7f60-4d23-98ec-51f3c8a831aa" (UID: "082914b4-7f60-4d23-98ec-51f3c8a831aa"). InnerVolumeSpecName "kube-api-access-c2tkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.535953 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-scripts" (OuterVolumeSpecName: "scripts") pod "082914b4-7f60-4d23-98ec-51f3c8a831aa" (UID: "082914b4-7f60-4d23-98ec-51f3c8a831aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.554177 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "082914b4-7f60-4d23-98ec-51f3c8a831aa" (UID: "082914b4-7f60-4d23-98ec-51f3c8a831aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.564090 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-config-data" (OuterVolumeSpecName: "config-data") pod "082914b4-7f60-4d23-98ec-51f3c8a831aa" (UID: "082914b4-7f60-4d23-98ec-51f3c8a831aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.620695 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.620723 4757 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.620732 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.620742 4757 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.620751 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082914b4-7f60-4d23-98ec-51f3c8a831aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.620759 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2tkh\" (UniqueName: \"kubernetes.io/projected/082914b4-7f60-4d23-98ec-51f3c8a831aa-kube-api-access-c2tkh\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.888466 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kg598" event={"ID":"082914b4-7f60-4d23-98ec-51f3c8a831aa","Type":"ContainerDied","Data":"5b9feb52f7041becd238d54ea90fd4666e7207e9d14642dbd27bb45bd464620f"}
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.888520 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b9feb52f7041becd238d54ea90fd4666e7207e9d14642dbd27bb45bd464620f"
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.888567 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kg598"
Dec 16 13:07:49 crc kubenswrapper[4757]: I1216 13:07:49.891745 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f22fc73-c034-4c9d-8274-215e0ef2a208","Type":"ContainerStarted","Data":"f8d93c404b1b656f408c4baae8b376e616731c4e5531a4ad8f1763546e23f59d"}
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.060267 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-66866d5f44-2mhtb"]
Dec 16 13:07:50 crc kubenswrapper[4757]: E1216 13:07:50.060974 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082914b4-7f60-4d23-98ec-51f3c8a831aa" containerName="keystone-bootstrap"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.060996 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="082914b4-7f60-4d23-98ec-51f3c8a831aa" containerName="keystone-bootstrap"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.070089 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="082914b4-7f60-4d23-98ec-51f3c8a831aa" containerName="keystone-bootstrap"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.070817 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.079953 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.080192 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.080287 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.088821 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rtqbc"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.090846 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.091290 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.157436 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-combined-ca-bundle\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.157789 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2lfj\" (UniqueName: \"kubernetes.io/projected/eb176388-d71c-4d06-986d-f62cb0d86fe3-kube-api-access-q2lfj\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.157910 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-public-tls-certs\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.158216 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-scripts\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.172338 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-config-data\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.172608 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-credential-keys\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.172760 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-fernet-keys\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.172866 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-internal-tls-certs\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.208198 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66866d5f44-2mhtb"]
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274664 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-scripts\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274743 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-config-data\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274768 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-credential-keys\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274796 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-fernet-keys\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274821 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-internal-tls-certs\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274898 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-combined-ca-bundle\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274948 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2lfj\" (UniqueName: \"kubernetes.io/projected/eb176388-d71c-4d06-986d-f62cb0d86fe3-kube-api-access-q2lfj\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.274971 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-public-tls-certs\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.282757 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-config-data\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.283033 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-fernet-keys\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.283516 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-public-tls-certs\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.284264 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-scripts\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.293550 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-credential-keys\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.294629 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-internal-tls-certs\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.295191 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb176388-d71c-4d06-986d-f62cb0d86fe3-combined-ca-bundle\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.310619 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2lfj\" (UniqueName: \"kubernetes.io/projected/eb176388-d71c-4d06-986d-f62cb0d86fe3-kube-api-access-q2lfj\") pod \"keystone-66866d5f44-2mhtb\" (UID: \"eb176388-d71c-4d06-986d-f62cb0d86fe3\") " pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.460740 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.903120 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"da952307-39db-4816-8465-d931bd94436d","Type":"ContainerStarted","Data":"6090910b49339dc35f9500d38541fc4e7f62155a18bb86a432bc3ce4ec1caaa3"}
Dec 16 13:07:50 crc kubenswrapper[4757]: I1216 13:07:50.944572 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66866d5f44-2mhtb"]
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.181627 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.181913 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.467917 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75ccc7d896-jmrk9"
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.468260 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75ccc7d896-jmrk9"
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.582368 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d66ddf65b-lmltr"
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.582415 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d66ddf65b-lmltr"
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.960469 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66866d5f44-2mhtb" event={"ID":"eb176388-d71c-4d06-986d-f62cb0d86fe3","Type":"ContainerStarted","Data":"80b1dc501cbee58351800d1c321833b8f759469b512638c5c2562acbf34b8484"}
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.961588 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66866d5f44-2mhtb" event={"ID":"eb176388-d71c-4d06-986d-f62cb0d86fe3","Type":"ContainerStarted","Data":"13a7f7a54c35c9091074d2d7d35ee09495a44c185efb7cd455212e8bb61a82e0"}
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.961621 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-66866d5f44-2mhtb"
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.983823 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f22fc73-c034-4c9d-8274-215e0ef2a208","Type":"ContainerStarted","Data":"49a74036aa1ba6dc9a0c5edfcbb46392c94d56a8c44ad631d4951f9cf16d3f78"}
Dec 16 13:07:51 crc kubenswrapper[4757]: I1216 13:07:51.987484 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-66866d5f44-2mhtb" podStartSLOduration=1.9874563379999999 podStartE2EDuration="1.987456338s" podCreationTimestamp="2025-12-16 13:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:07:51.979291036 +0000 UTC m=+1257.407034832" watchObservedRunningTime="2025-12-16 13:07:51.987456338 +0000 UTC m=+1257.415200154"
Dec 16 13:07:52 crc kubenswrapper[4757]: I1216 13:07:52.008385 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.008364893 podStartE2EDuration="7.008364893s" podCreationTimestamp="2025-12-16 13:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:07:52.0073246 +0000 UTC m=+1257.435068406" watchObservedRunningTime="2025-12-16 13:07:52.008364893 +0000 UTC m=+1257.436108689"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.019365 4757 generic.go:334] "Generic (PLEG): container finished" podID="25d57428-c378-4e57-87c5-f1fff2398cec" containerID="f20ef5e006c72abb7a0a7a3f3c95380816bf64afd8ac9ee819347f5406e6e010" exitCode=0
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.019905 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86cds" event={"ID":"25d57428-c378-4e57-87c5-f1fff2398cec","Type":"ContainerDied","Data":"f20ef5e006c72abb7a0a7a3f3c95380816bf64afd8ac9ee819347f5406e6e010"}
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.025201 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bc2074-3d53-44b4-8e6c-c500a2617944","Type":"ContainerStarted","Data":"0ef145acfbfafa8bd433cce6a05108d27b07bea77164b78afc87588a8d6595b5"}
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.043243 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.043224759 podStartE2EDuration="11.043224759s" podCreationTimestamp="2025-12-16 13:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:07:52.05138749 +0000 UTC m=+1257.479131296" watchObservedRunningTime="2025-12-16 13:07:56.043224759 +0000 UTC m=+1261.470968555"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.202744 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.203737 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.238031 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.238391 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.241126 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.270330 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.287410 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:56 crc kubenswrapper[4757]: I1216 13:07:56.300241 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.037171 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.037208 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.037333 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.037346 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.460131 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-86cds"
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.529559 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-combined-ca-bundle\") pod \"25d57428-c378-4e57-87c5-f1fff2398cec\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") "
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.529638 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d57428-c378-4e57-87c5-f1fff2398cec-logs\") pod \"25d57428-c378-4e57-87c5-f1fff2398cec\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") "
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.529691 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-config-data\") pod \"25d57428-c378-4e57-87c5-f1fff2398cec\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") "
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.529841 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-scripts\") pod \"25d57428-c378-4e57-87c5-f1fff2398cec\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") "
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.529875 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25lt5\" (UniqueName: \"kubernetes.io/projected/25d57428-c378-4e57-87c5-f1fff2398cec-kube-api-access-25lt5\") pod \"25d57428-c378-4e57-87c5-f1fff2398cec\" (UID: \"25d57428-c378-4e57-87c5-f1fff2398cec\") "
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.530308 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d57428-c378-4e57-87c5-f1fff2398cec-logs" (OuterVolumeSpecName: "logs") pod "25d57428-c378-4e57-87c5-f1fff2398cec" (UID: "25d57428-c378-4e57-87c5-f1fff2398cec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.538119 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-scripts" (OuterVolumeSpecName: "scripts") pod "25d57428-c378-4e57-87c5-f1fff2398cec" (UID: "25d57428-c378-4e57-87c5-f1fff2398cec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.561192 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d57428-c378-4e57-87c5-f1fff2398cec-kube-api-access-25lt5" (OuterVolumeSpecName: "kube-api-access-25lt5") pod "25d57428-c378-4e57-87c5-f1fff2398cec" (UID: "25d57428-c378-4e57-87c5-f1fff2398cec"). InnerVolumeSpecName "kube-api-access-25lt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.570676 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-config-data" (OuterVolumeSpecName: "config-data") pod "25d57428-c378-4e57-87c5-f1fff2398cec" (UID: "25d57428-c378-4e57-87c5-f1fff2398cec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.572904 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25d57428-c378-4e57-87c5-f1fff2398cec" (UID: "25d57428-c378-4e57-87c5-f1fff2398cec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.632812 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.632857 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d57428-c378-4e57-87c5-f1fff2398cec-logs\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.632869 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.632879 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25d57428-c378-4e57-87c5-f1fff2398cec-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:57 crc kubenswrapper[4757]: I1216 13:07:57.632889 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25lt5\" (UniqueName: \"kubernetes.io/projected/25d57428-c378-4e57-87c5-f1fff2398cec-kube-api-access-25lt5\") on node \"crc\" DevicePath \"\""
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.051675 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-86cds"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.051692 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86cds" event={"ID":"25d57428-c378-4e57-87c5-f1fff2398cec","Type":"ContainerDied","Data":"b2637d8efcd2a610a4942f335c526c5c2a2391b7f76e50fb356f94e260d09867"}
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.051741 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2637d8efcd2a610a4942f335c526c5c2a2391b7f76e50fb356f94e260d09867"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.068158 4757 generic.go:334] "Generic (PLEG): container finished" podID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" containerID="5e198d54c33d19283c1b227ff4c28be1278e25af8630e271c1f46e46d1980127" exitCode=0
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.068581 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tjbz5" event={"ID":"362eaecb-4139-44f9-a651-3e14cc2d6ae2","Type":"ContainerDied","Data":"5e198d54c33d19283c1b227ff4c28be1278e25af8630e271c1f46e46d1980127"}
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.166947 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5494d9c5f6-8dwpv"]
Dec 16 13:07:58 crc kubenswrapper[4757]: E1216 13:07:58.167764 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d57428-c378-4e57-87c5-f1fff2398cec" containerName="placement-db-sync"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.167788 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d57428-c378-4e57-87c5-f1fff2398cec" containerName="placement-db-sync"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.167967 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d57428-c378-4e57-87c5-f1fff2398cec" containerName="placement-db-sync"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.175149 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.181693 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.181749 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2hwt5"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.181899 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.181919 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.182098 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.201798 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5494d9c5f6-8dwpv"]
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252095 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-public-tls-certs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252175 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-internal-tls-certs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252249 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ba167f-6c35-410d-b690-1083c5a482ae-logs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252404 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzzg\" (UniqueName: \"kubernetes.io/projected/e8ba167f-6c35-410d-b690-1083c5a482ae-kube-api-access-xwzzg\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252504 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-config-data\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252539 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-combined-ca-bundle\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.252623 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-scripts\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354151 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzzg\" (UniqueName: \"kubernetes.io/projected/e8ba167f-6c35-410d-b690-1083c5a482ae-kube-api-access-xwzzg\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354594 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-config-data\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354627 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-combined-ca-bundle\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354704 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-scripts\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354786 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-public-tls-certs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354857 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-internal-tls-certs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.354982 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ba167f-6c35-410d-b690-1083c5a482ae-logs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.355372 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ba167f-6c35-410d-b690-1083c5a482ae-logs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.362959 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-config-data\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.365028 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-public-tls-certs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.365553 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-scripts\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.365630 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-internal-tls-certs\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.367714 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ba167f-6c35-410d-b690-1083c5a482ae-combined-ca-bundle\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.386181 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzzg\" (UniqueName: \"kubernetes.io/projected/e8ba167f-6c35-410d-b690-1083c5a482ae-kube-api-access-xwzzg\") pod \"placement-5494d9c5f6-8dwpv\" (UID: \"e8ba167f-6c35-410d-b690-1083c5a482ae\") " pod="openstack/placement-5494d9c5f6-8dwpv"
Dec 16 13:07:58 crc kubenswrapper[4757]: I1216 13:07:58.498144 4757 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-5494d9c5f6-8dwpv" Dec 16 13:07:59 crc kubenswrapper[4757]: I1216 13:07:59.080941 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:07:59 crc kubenswrapper[4757]: I1216 13:07:59.082500 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:07:59 crc kubenswrapper[4757]: I1216 13:07:59.082458 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:07:59 crc kubenswrapper[4757]: I1216 13:07:59.083234 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:07:59 crc kubenswrapper[4757]: I1216 13:07:59.148802 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5494d9c5f6-8dwpv"] Dec 16 13:07:59 crc kubenswrapper[4757]: W1216 13:07:59.163929 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8ba167f_6c35_410d_b690_1083c5a482ae.slice/crio-cf0927b40cdcda6ad2dc410d6bc5bbf77bd80f6087bb267bdf2f562f32bc7a45 WatchSource:0}: Error finding container cf0927b40cdcda6ad2dc410d6bc5bbf77bd80f6087bb267bdf2f562f32bc7a45: Status 404 returned error can't find the container with id cf0927b40cdcda6ad2dc410d6bc5bbf77bd80f6087bb267bdf2f562f32bc7a45 Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.089039 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5494d9c5f6-8dwpv" event={"ID":"e8ba167f-6c35-410d-b690-1083c5a482ae","Type":"ContainerStarted","Data":"cf0927b40cdcda6ad2dc410d6bc5bbf77bd80f6087bb267bdf2f562f32bc7a45"} Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.503258 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tjbz5" Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.603797 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-combined-ca-bundle\") pod \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.603970 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-db-sync-config-data\") pod \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.604103 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w9r5\" (UniqueName: \"kubernetes.io/projected/362eaecb-4139-44f9-a651-3e14cc2d6ae2-kube-api-access-5w9r5\") pod \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\" (UID: \"362eaecb-4139-44f9-a651-3e14cc2d6ae2\") " Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.611844 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362eaecb-4139-44f9-a651-3e14cc2d6ae2-kube-api-access-5w9r5" (OuterVolumeSpecName: "kube-api-access-5w9r5") pod "362eaecb-4139-44f9-a651-3e14cc2d6ae2" (UID: "362eaecb-4139-44f9-a651-3e14cc2d6ae2"). InnerVolumeSpecName "kube-api-access-5w9r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.613324 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "362eaecb-4139-44f9-a651-3e14cc2d6ae2" (UID: "362eaecb-4139-44f9-a651-3e14cc2d6ae2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.637338 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "362eaecb-4139-44f9-a651-3e14cc2d6ae2" (UID: "362eaecb-4139-44f9-a651-3e14cc2d6ae2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.706504 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.706546 4757 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/362eaecb-4139-44f9-a651-3e14cc2d6ae2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:00 crc kubenswrapper[4757]: I1216 13:08:00.706555 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w9r5\" (UniqueName: \"kubernetes.io/projected/362eaecb-4139-44f9-a651-3e14cc2d6ae2-kube-api-access-5w9r5\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.107100 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tjbz5" event={"ID":"362eaecb-4139-44f9-a651-3e14cc2d6ae2","Type":"ContainerDied","Data":"35b8c3889adb1686b338f16a57cc08b1c2746bf3764724b5fbca08bd0b93b185"} Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.107356 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35b8c3889adb1686b338f16a57cc08b1c2746bf3764724b5fbca08bd0b93b185" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.107310 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tjbz5" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.109895 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5494d9c5f6-8dwpv" event={"ID":"e8ba167f-6c35-410d-b690-1083c5a482ae","Type":"ContainerStarted","Data":"7de49ffa90cb19f1856ba91d706a4725210d37f939620f9487453cb2ebcad152"} Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.109952 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5494d9c5f6-8dwpv" event={"ID":"e8ba167f-6c35-410d-b690-1083c5a482ae","Type":"ContainerStarted","Data":"391482051cc195f0925dbaad2ebe0829f3fe6742635de618e98ad61bdbcd34ef"} Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.111637 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5494d9c5f6-8dwpv" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.112339 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5494d9c5f6-8dwpv" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.140823 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5494d9c5f6-8dwpv" podStartSLOduration=3.140783577 podStartE2EDuration="3.140783577s" podCreationTimestamp="2025-12-16 13:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:01.132400221 +0000 UTC m=+1266.560144017" watchObservedRunningTime="2025-12-16 13:08:01.140783577 +0000 UTC m=+1266.568527373" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.470652 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.589056 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.765239 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-9bd99879-lw2rw"] Dec 16 13:08:01 crc kubenswrapper[4757]: E1216 13:08:01.774063 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" containerName="barbican-db-sync" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.774089 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" containerName="barbican-db-sync" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.774303 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" containerName="barbican-db-sync" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.775193 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.789478 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l49sv" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.806149 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.816094 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-9bd99879-lw2rw"] Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.828104 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-config-data\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.828203 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-config-data-custom\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.828320 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-combined-ca-bundle\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.828356 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtjw\" (UniqueName: \"kubernetes.io/projected/7d1df7bf-6c39-4e49-873a-701b8c05f900-kube-api-access-qjtjw\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.828443 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d1df7bf-6c39-4e49-873a-701b8c05f900-logs\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.840749 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.843727 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d9fbfd57d-79dn2"] Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.845191 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.848286 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.882203 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d9fbfd57d-79dn2"] Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932052 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-combined-ca-bundle\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932098 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjtjw\" (UniqueName: \"kubernetes.io/projected/7d1df7bf-6c39-4e49-873a-701b8c05f900-kube-api-access-qjtjw\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932137 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932190 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d1df7bf-6c39-4e49-873a-701b8c05f900-logs\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932213 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-config-data\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932241 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-config-data\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932274 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-config-data-custom\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932320 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-config-data-custom\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932349 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2bt\" (UniqueName: \"kubernetes.io/projected/25963bc5-afd1-4703-a583-df0d8094117d-kube-api-access-jd2bt\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.932392 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25963bc5-afd1-4703-a583-df0d8094117d-logs\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.934291 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d1df7bf-6c39-4e49-873a-701b8c05f900-logs\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.939514 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-7fn6m"] Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.966931 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-config-data\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.967642 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-config-data-custom\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.968768 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.974679 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1df7bf-6c39-4e49-873a-701b8c05f900-combined-ca-bundle\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.974727 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjtjw\" (UniqueName: \"kubernetes.io/projected/7d1df7bf-6c39-4e49-873a-701b8c05f900-kube-api-access-qjtjw\") pod \"barbican-worker-9bd99879-lw2rw\" (UID: \"7d1df7bf-6c39-4e49-873a-701b8c05f900\") " pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:01 crc kubenswrapper[4757]: I1216 13:08:01.975901 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-7fn6m"] Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034592 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-config-data\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034642 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-config\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034673 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034707 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qtr\" (UniqueName: \"kubernetes.io/projected/331a02af-2a7a-4a29-9709-ed1a45fa8046-kube-api-access-p7qtr\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034744 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-config-data-custom\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034771 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034806 4757 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034857 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2bt\" (UniqueName: \"kubernetes.io/projected/25963bc5-afd1-4703-a583-df0d8094117d-kube-api-access-jd2bt\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034892 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25963bc5-afd1-4703-a583-df0d8094117d-logs\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.034971 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.035105 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.040167 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25963bc5-afd1-4703-a583-df0d8094117d-logs\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.045214 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-config-data\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.062739 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.068150 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25963bc5-afd1-4703-a583-df0d8094117d-config-data-custom\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: 
\"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.076253 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2bt\" (UniqueName: \"kubernetes.io/projected/25963bc5-afd1-4703-a583-df0d8094117d-kube-api-access-jd2bt\") pod \"barbican-keystone-listener-5d9fbfd57d-79dn2\" (UID: \"25963bc5-afd1-4703-a583-df0d8094117d\") " pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.137084 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.137176 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-config\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.137205 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.137249 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qtr\" (UniqueName: \"kubernetes.io/projected/331a02af-2a7a-4a29-9709-ed1a45fa8046-kube-api-access-p7qtr\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.137291 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.137322 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.138719 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.138759 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: 
\"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.139300 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.139784 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.139981 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-config\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.150428 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-9bd99879-lw2rw" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.153657 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57f49b544b-vlhzn"] Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.155994 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.180386 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.184334 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.201641 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qtr\" (UniqueName: \"kubernetes.io/projected/331a02af-2a7a-4a29-9709-ed1a45fa8046-kube-api-access-p7qtr\") pod \"dnsmasq-dns-7c67bffd47-7fn6m\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.209848 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57f49b544b-vlhzn"] Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.241530 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data-custom\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.241945 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-combined-ca-bundle\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.242067 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2c2z\" (UniqueName: \"kubernetes.io/projected/3687417c-3bcd-48ea-b328-440ff4005a02-kube-api-access-n2c2z\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.242109 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687417c-3bcd-48ea-b328-440ff4005a02-logs\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.242156 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.343404 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-combined-ca-bundle\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.343496 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2c2z\" (UniqueName: \"kubernetes.io/projected/3687417c-3bcd-48ea-b328-440ff4005a02-kube-api-access-n2c2z\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.343532 
4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687417c-3bcd-48ea-b328-440ff4005a02-logs\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.343575 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.343622 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data-custom\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.344425 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687417c-3bcd-48ea-b328-440ff4005a02-logs\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.346998 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-combined-ca-bundle\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.348718 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data-custom\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.352775 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.366089 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2c2z\" (UniqueName: \"kubernetes.io/projected/3687417c-3bcd-48ea-b328-440ff4005a02-kube-api-access-n2c2z\") pod \"barbican-api-57f49b544b-vlhzn\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.483903 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:02 crc kubenswrapper[4757]: I1216 13:08:02.517754 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:04 crc kubenswrapper[4757]: I1216 13:08:04.814676 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 13:08:04 crc kubenswrapper[4757]: I1216 13:08:04.815425 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:08:04 crc kubenswrapper[4757]: I1216 13:08:04.820811 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 13:08:04 crc kubenswrapper[4757]: I1216 13:08:04.849426 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 13:08:04 crc kubenswrapper[4757]: I1216 13:08:04.849554 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:08:04 crc kubenswrapper[4757]: I1216 13:08:04.868164 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 13:08:05 crc kubenswrapper[4757]: I1216 13:08:05.277400 4757 generic.go:334] "Generic (PLEG): container finished" podID="e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" containerID="0e351d201b95334e733de4822ee4dfa2d43cbb85bf172b683744b901cb0cd0e8" exitCode=0 Dec 16 13:08:05 crc kubenswrapper[4757]: I1216 13:08:05.277669 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lhwd" event={"ID":"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00","Type":"ContainerDied","Data":"0e351d201b95334e733de4822ee4dfa2d43cbb85bf172b683744b901cb0cd0e8"} Dec 16 13:08:06 crc kubenswrapper[4757]: I1216 13:08:06.307597 4757 generic.go:334] "Generic (PLEG): container finished" podID="fc9a7054-7c7f-4e36-8d57-e095087a7878" containerID="e432176b39a460d47e7d77399b0f4c007df8990074a5a85f35961c0774cacecc" exitCode=0 Dec 16 13:08:06 crc kubenswrapper[4757]: I1216 13:08:06.308441 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pcl59" event={"ID":"fc9a7054-7c7f-4e36-8d57-e095087a7878","Type":"ContainerDied","Data":"e432176b39a460d47e7d77399b0f4c007df8990074a5a85f35961c0774cacecc"} Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.023253 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cfff6bfd-qz5sk"] Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.032864 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.036714 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.040256 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.117846 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cfff6bfd-qz5sk"] Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153083 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-combined-ca-bundle\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153272 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-logs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153313 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-public-tls-certs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153479 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkbq\" (UniqueName: \"kubernetes.io/projected/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-kube-api-access-gjkbq\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153594 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-internal-tls-certs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153656 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-config-data-custom\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.153767 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-config-data\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255248 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-logs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255306 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-public-tls-certs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255395 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkbq\" (UniqueName: \"kubernetes.io/projected/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-kube-api-access-gjkbq\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255451 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-internal-tls-certs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255493 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-config-data-custom\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255572 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-config-data\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.255609 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-combined-ca-bundle\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.256739 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-logs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.266871 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-combined-ca-bundle\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.273814 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-config-data-custom\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.273903 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-internal-tls-certs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.275782 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-public-tls-certs\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.286243 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkbq\" (UniqueName: \"kubernetes.io/projected/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-kube-api-access-gjkbq\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.286377 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd29da8f-05a6-43a9-a943-c6a8a4ef8479-config-data\") pod \"barbican-api-7cfff6bfd-qz5sk\" (UID: \"fd29da8f-05a6-43a9-a943-c6a8a4ef8479\") " pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:07 crc kubenswrapper[4757]: I1216 13:08:07.362311 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.717162 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.726830 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pcl59" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.779430 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-db-sync-config-data\") pod \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.779698 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-combined-ca-bundle\") pod \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.779941 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpnll\" (UniqueName: \"kubernetes.io/projected/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-kube-api-access-bpnll\") pod \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.780181 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-scripts\") pod \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.780283 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-etc-machine-id\") pod \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.780626 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-config-data\") pod \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\" (UID: \"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.787299 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-kube-api-access-bpnll" (OuterVolumeSpecName: "kube-api-access-bpnll") pod "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" (UID: "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00"). InnerVolumeSpecName "kube-api-access-bpnll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.789037 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" (UID: "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.790407 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" (UID: "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.796830 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-scripts" (OuterVolumeSpecName: "scripts") pod "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" (UID: "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.819589 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" (UID: "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.865311 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-config-data" (OuterVolumeSpecName: "config-data") pod "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" (UID: "e48bb858-bd6e-4dbc-a17a-5fd5e1275e00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.882301 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-config\") pod \"fc9a7054-7c7f-4e36-8d57-e095087a7878\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.882429 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-combined-ca-bundle\") pod \"fc9a7054-7c7f-4e36-8d57-e095087a7878\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.882514 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsmk8\" (UniqueName: \"kubernetes.io/projected/fc9a7054-7c7f-4e36-8d57-e095087a7878-kube-api-access-jsmk8\") pod \"fc9a7054-7c7f-4e36-8d57-e095087a7878\" (UID: \"fc9a7054-7c7f-4e36-8d57-e095087a7878\") " Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.883321 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.883343 4757 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.883351 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.883360 4757 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.883371 4757 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.883379 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpnll\" (UniqueName: \"kubernetes.io/projected/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00-kube-api-access-bpnll\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.889779 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9a7054-7c7f-4e36-8d57-e095087a7878-kube-api-access-jsmk8" (OuterVolumeSpecName: "kube-api-access-jsmk8") pod "fc9a7054-7c7f-4e36-8d57-e095087a7878" (UID: "fc9a7054-7c7f-4e36-8d57-e095087a7878"). InnerVolumeSpecName "kube-api-access-jsmk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.924200 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-config" (OuterVolumeSpecName: "config") pod "fc9a7054-7c7f-4e36-8d57-e095087a7878" (UID: "fc9a7054-7c7f-4e36-8d57-e095087a7878"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.927821 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc9a7054-7c7f-4e36-8d57-e095087a7878" (UID: "fc9a7054-7c7f-4e36-8d57-e095087a7878"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.985713 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsmk8\" (UniqueName: \"kubernetes.io/projected/fc9a7054-7c7f-4e36-8d57-e095087a7878-kube-api-access-jsmk8\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.985764 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:08 crc kubenswrapper[4757]: I1216 13:08:08.985776 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9a7054-7c7f-4e36-8d57-e095087a7878-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:09 crc kubenswrapper[4757]: I1216 13:08:09.338738 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-pcl59" event={"ID":"fc9a7054-7c7f-4e36-8d57-e095087a7878","Type":"ContainerDied","Data":"2c2d9ed917162169e453b25dfddd2dc9e28edcbdc6eeb816a63b02fa65224627"} Dec 16 13:08:09 crc kubenswrapper[4757]: I1216 13:08:09.339091 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2d9ed917162169e453b25dfddd2dc9e28edcbdc6eeb816a63b02fa65224627" Dec 16 13:08:09 crc kubenswrapper[4757]: I1216 13:08:09.338975 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-pcl59" Dec 16 13:08:09 crc kubenswrapper[4757]: I1216 13:08:09.341949 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7lhwd" event={"ID":"e48bb858-bd6e-4dbc-a17a-5fd5e1275e00","Type":"ContainerDied","Data":"cf72460b542b71a19f17bbda38b33db4f696dd3995f9ac76e1fdfac04b32a377"} Dec 16 13:08:09 crc kubenswrapper[4757]: I1216 13:08:09.341992 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf72460b542b71a19f17bbda38b33db4f696dd3995f9ac76e1fdfac04b32a377" Dec 16 13:08:09 crc kubenswrapper[4757]: I1216 13:08:09.342173 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7lhwd" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.083109 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:10 crc kubenswrapper[4757]: E1216 13:08:10.083853 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" containerName="cinder-db-sync" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.083869 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" containerName="cinder-db-sync" Dec 16 13:08:10 crc kubenswrapper[4757]: E1216 13:08:10.083902 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9a7054-7c7f-4e36-8d57-e095087a7878" containerName="neutron-db-sync" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.083910 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9a7054-7c7f-4e36-8d57-e095087a7878" containerName="neutron-db-sync" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.084130 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9a7054-7c7f-4e36-8d57-e095087a7878" containerName="neutron-db-sync" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.084158 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" containerName="cinder-db-sync" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.088893 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.092709 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.118928 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qgfl8" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.125874 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.144247 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.225016 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4aae0516-59bd-448d-80be-c0df3dc002b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.225442 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.225513 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66vsj\" (UniqueName: \"kubernetes.io/projected/4aae0516-59bd-448d-80be-c0df3dc002b9-kube-api-access-66vsj\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.225609 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.225659 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.225699 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.240818 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.268917 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-7fn6m"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.281100 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d9fbfd57d-79dn2"] 
Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.326125 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vj64h"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.327698 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328432 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4aae0516-59bd-448d-80be-c0df3dc002b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328594 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328650 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66vsj\" (UniqueName: \"kubernetes.io/projected/4aae0516-59bd-448d-80be-c0df3dc002b9-kube-api-access-66vsj\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328663 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4aae0516-59bd-448d-80be-c0df3dc002b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328750 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328782 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.328814 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.340921 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.341426 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.343038 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.352594 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.388217 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vj64h"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.408585 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66vsj\" (UniqueName: \"kubernetes.io/projected/4aae0516-59bd-448d-80be-c0df3dc002b9-kube-api-access-66vsj\") pod \"cinder-scheduler-0\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.423220 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.446452 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.446536 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btrnr\" (UniqueName: \"kubernetes.io/projected/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-kube-api-access-btrnr\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.446686 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.446707 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-config\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.446756 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.446848 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.557201 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.557267 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btrnr\" (UniqueName: \"kubernetes.io/projected/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-kube-api-access-btrnr\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.557380 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.557410 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-config\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.557457 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.557519 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.558784 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.558799 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-config\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 
13:08:10.558848 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.559471 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.565678 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.581517 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vj64h"] Dec 16 13:08:10 crc kubenswrapper[4757]: E1216 13:08:10.582286 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-btrnr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" podUID="7e971f39-c94d-4a76-b58c-86d4e3be1aeb" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.621265 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btrnr\" (UniqueName: \"kubernetes.io/projected/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-kube-api-access-btrnr\") pod \"dnsmasq-dns-848cf88cfc-vj64h\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.667613 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9d8647f88-tzcvz"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.669531 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.684347 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.684558 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qwv54" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.684670 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.684849 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.713795 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-drpwc"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.715292 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.727217 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d8647f88-tzcvz"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.761276 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msp9c\" (UniqueName: \"kubernetes.io/projected/7702799e-4dc6-4a4d-b479-59cae8163e3c-kube-api-access-msp9c\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.761319 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-config\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.761365 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-httpd-config\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.761380 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-ovndb-tls-certs\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.761395 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-combined-ca-bundle\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.774084 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-drpwc"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865492 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-config\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865550 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8xfj\" (UniqueName: \"kubernetes.io/projected/5c88d603-6fdd-446a-a46c-990d30bacb6c-kube-api-access-k8xfj\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865579 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msp9c\" (UniqueName: \"kubernetes.io/projected/7702799e-4dc6-4a4d-b479-59cae8163e3c-kube-api-access-msp9c\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " 
pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865603 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-config\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865621 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865650 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-ovndb-tls-certs\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865666 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-httpd-config\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865684 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-combined-ca-bundle\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865740 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865767 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-svc\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.865811 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.881812 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-ovndb-tls-certs\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc 
kubenswrapper[4757]: I1216 13:08:10.882321 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-httpd-config\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.885909 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-config\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.898838 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-combined-ca-bundle\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.918883 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msp9c\" (UniqueName: \"kubernetes.io/projected/7702799e-4dc6-4a4d-b479-59cae8163e3c-kube-api-access-msp9c\") pod \"neutron-9d8647f88-tzcvz\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.946221 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.950750 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.956668 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.967734 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8xfj\" (UniqueName: \"kubernetes.io/projected/5c88d603-6fdd-446a-a46c-990d30bacb6c-kube-api-access-k8xfj\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.967790 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.967882 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.967919 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-svc\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 
13:08:10.967972 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.968053 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-config\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.969103 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-config\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.969147 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.969812 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-svc\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.969819 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.970365 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:10 crc kubenswrapper[4757]: I1216 13:08:10.994422 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.002835 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.018465 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8xfj\" (UniqueName: \"kubernetes.io/projected/5c88d603-6fdd-446a-a46c-990d30bacb6c-kube-api-access-k8xfj\") pod \"dnsmasq-dns-6578955fd5-drpwc\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.037694 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.076860 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data-custom\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.076959 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66bf65f8-2687-4d20-b491-7a90392a3587-etc-machine-id\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.076988 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8j9\" (UniqueName: \"kubernetes.io/projected/66bf65f8-2687-4d20-b491-7a90392a3587-kube-api-access-zk8j9\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.077040 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.077070 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.077129 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66bf65f8-2687-4d20-b491-7a90392a3587-logs\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.077153 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-scripts\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.178960 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-scripts\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.179124 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data-custom\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.179232 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/66bf65f8-2687-4d20-b491-7a90392a3587-etc-machine-id\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.179296 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8j9\" (UniqueName: \"kubernetes.io/projected/66bf65f8-2687-4d20-b491-7a90392a3587-kube-api-access-zk8j9\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.179358 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.179393 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.179471 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66bf65f8-2687-4d20-b491-7a90392a3587-logs\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.180151 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66bf65f8-2687-4d20-b491-7a90392a3587-etc-machine-id\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.180299 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66bf65f8-2687-4d20-b491-7a90392a3587-logs\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.193610 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data-custom\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.194075 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-scripts\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.196874 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.228370 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.236553 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8j9\" (UniqueName: \"kubernetes.io/projected/66bf65f8-2687-4d20-b491-7a90392a3587-kube-api-access-zk8j9\") pod \"cinder-api-0\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.369960 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.379984 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.394853 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.469096 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.490365 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-swift-storage-0\") pod \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.490472 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btrnr\" (UniqueName: \"kubernetes.io/projected/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-kube-api-access-btrnr\") pod \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.490522 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-sb\") pod \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.490563 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-nb\") pod \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.490600 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-config\") pod \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\" (UID: \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.490641 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-svc\") pod \"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\" (UID: 
\"7e971f39-c94d-4a76-b58c-86d4e3be1aeb\") " Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.491440 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e971f39-c94d-4a76-b58c-86d4e3be1aeb" (UID: "7e971f39-c94d-4a76-b58c-86d4e3be1aeb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.491663 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e971f39-c94d-4a76-b58c-86d4e3be1aeb" (UID: "7e971f39-c94d-4a76-b58c-86d4e3be1aeb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.492392 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e971f39-c94d-4a76-b58c-86d4e3be1aeb" (UID: "7e971f39-c94d-4a76-b58c-86d4e3be1aeb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.492571 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e971f39-c94d-4a76-b58c-86d4e3be1aeb" (UID: "7e971f39-c94d-4a76-b58c-86d4e3be1aeb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.492965 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-config" (OuterVolumeSpecName: "config") pod "7e971f39-c94d-4a76-b58c-86d4e3be1aeb" (UID: "7e971f39-c94d-4a76-b58c-86d4e3be1aeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.495298 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-kube-api-access-btrnr" (OuterVolumeSpecName: "kube-api-access-btrnr") pod "7e971f39-c94d-4a76-b58c-86d4e3be1aeb" (UID: "7e971f39-c94d-4a76-b58c-86d4e3be1aeb"). InnerVolumeSpecName "kube-api-access-btrnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.582436 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.592732 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.592771 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btrnr\" (UniqueName: \"kubernetes.io/projected/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-kube-api-access-btrnr\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.592785 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.592795 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.592807 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:11 crc kubenswrapper[4757]: I1216 13:08:11.592818 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e971f39-c94d-4a76-b58c-86d4e3be1aeb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:11 crc kubenswrapper[4757]: W1216 13:08:11.804138 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25963bc5_afd1_4703_a583_df0d8094117d.slice/crio-a6c6bb8b944e9c0caa81568499c401954994b6659e4de29d4cb497cac23b5214 WatchSource:0}: Error finding container a6c6bb8b944e9c0caa81568499c401954994b6659e4de29d4cb497cac23b5214: Status 404 returned error can't find the container with id a6c6bb8b944e9c0caa81568499c401954994b6659e4de29d4cb497cac23b5214 Dec 16 13:08:11 crc kubenswrapper[4757]: E1216 13:08:11.825337 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 16 13:08:11 crc kubenswrapper[4757]: E1216 13:08:11.825552 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs2rv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(32bc2074-3d53-44b4-8e6c-c500a2617944): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 13:08:11 crc kubenswrapper[4757]: E1216 13:08:11.827251 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="32bc2074-3d53-44b4-8e6c-c500a2617944" Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.396291 4757 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32bc2074-3d53-44b4-8e6c-c500a2617944" containerName="sg-core" containerID="cri-o://0ef145acfbfafa8bd433cce6a05108d27b07bea77164b78afc87588a8d6595b5" gracePeriod=30 Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.396784 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" event={"ID":"25963bc5-afd1-4703-a583-df0d8094117d","Type":"ContainerStarted","Data":"a6c6bb8b944e9c0caa81568499c401954994b6659e4de29d4cb497cac23b5214"} Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.396867 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vj64h" Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.580648 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vj64h"] Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.608360 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vj64h"] Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.684281 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cfff6bfd-qz5sk"] Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.716609 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57f49b544b-vlhzn"] Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.747888 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-drpwc"] Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.829956 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-9bd99879-lw2rw"] Dec 16 13:08:12 crc kubenswrapper[4757]: I1216 13:08:12.982373 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e971f39-c94d-4a76-b58c-86d4e3be1aeb" path="/var/lib/kubelet/pods/7e971f39-c94d-4a76-b58c-86d4e3be1aeb/volumes" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.277568 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-7fn6m"] Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.322102 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.348146 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.458897 4757 generic.go:334] "Generic (PLEG): container finished" podID="32bc2074-3d53-44b4-8e6c-c500a2617944" containerID="0ef145acfbfafa8bd433cce6a05108d27b07bea77164b78afc87588a8d6595b5" exitCode=2 Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.459718 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bc2074-3d53-44b4-8e6c-c500a2617944","Type":"ContainerDied","Data":"0ef145acfbfafa8bd433cce6a05108d27b07bea77164b78afc87588a8d6595b5"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.461346 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f49b544b-vlhzn" event={"ID":"3687417c-3bcd-48ea-b328-440ff4005a02","Type":"ContainerStarted","Data":"3326627538ad35c4498b98fc3e33b430703af1aaeda5cdcbbcdc565d73fdc2e3"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.461451 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f49b544b-vlhzn" 
event={"ID":"3687417c-3bcd-48ea-b328-440ff4005a02","Type":"ContainerStarted","Data":"6e3528c57c1a61809818466ba59afca126ee19375132550b8ac3c4748b35c287"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.462390 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9bd99879-lw2rw" event={"ID":"7d1df7bf-6c39-4e49-873a-701b8c05f900","Type":"ContainerStarted","Data":"9d10cbf41880fd13f536173498f03fe695ef4b4336864d4845c06a4578437c66"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.463185 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" event={"ID":"5c88d603-6fdd-446a-a46c-990d30bacb6c","Type":"ContainerStarted","Data":"3195e0b4375087b59f05398ee7b152c8b6b8cc610e1641a9bde9cb141e378bc4"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.464346 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4aae0516-59bd-448d-80be-c0df3dc002b9","Type":"ContainerStarted","Data":"e577e5b2021d61edc71a91c4a143880aa7d69a8e06345475deae8f37aa3ad810"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.466222 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66bf65f8-2687-4d20-b491-7a90392a3587","Type":"ContainerStarted","Data":"e4ff5c5f7e62bbec4d3ba8b7f2d999e403b61c2328002e87f4cfce552ea10f47"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.467785 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" event={"ID":"331a02af-2a7a-4a29-9709-ed1a45fa8046","Type":"ContainerStarted","Data":"baa4f2073f78d300c89d110e28c0bce070f7449a8553d06ef12e4e09c5284e80"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.480142 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cfff6bfd-qz5sk" event={"ID":"fd29da8f-05a6-43a9-a943-c6a8a4ef8479","Type":"ContainerStarted","Data":"7006a97dc644e63b8327ed97fd8c3298463140a13ddf837460cb0645e6a7b493"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.480201 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cfff6bfd-qz5sk" event={"ID":"fd29da8f-05a6-43a9-a943-c6a8a4ef8479","Type":"ContainerStarted","Data":"a5f671b8f2e0991bc7917665d7c915edd342ee60b4de78eef91441c86c7c21e3"} Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.497499 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587321 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-scripts\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587379 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-run-httpd\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587500 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-sg-core-conf-yaml\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587536 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-config-data\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587562 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-combined-ca-bundle\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587581 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-log-httpd\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.587689 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs2rv\" (UniqueName: \"kubernetes.io/projected/32bc2074-3d53-44b4-8e6c-c500a2617944-kube-api-access-vs2rv\") pod \"32bc2074-3d53-44b4-8e6c-c500a2617944\" (UID: \"32bc2074-3d53-44b4-8e6c-c500a2617944\") " Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.599462 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.599766 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.600356 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.642226 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-config-data" (OuterVolumeSpecName: "config-data") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.649785 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-scripts" (OuterVolumeSpecName: "scripts") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.652243 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.677979 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bc2074-3d53-44b4-8e6c-c500a2617944-kube-api-access-vs2rv" (OuterVolumeSpecName: "kube-api-access-vs2rv") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "kube-api-access-vs2rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.700607 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "32bc2074-3d53-44b4-8e6c-c500a2617944" (UID: "32bc2074-3d53-44b4-8e6c-c500a2617944"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.706794 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.706929 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.707022 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs2rv\" (UniqueName: \"kubernetes.io/projected/32bc2074-3d53-44b4-8e6c-c500a2617944-kube-api-access-vs2rv\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.707088 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.707140 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32bc2074-3d53-44b4-8e6c-c500a2617944-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.707203 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.707254 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc2074-3d53-44b4-8e6c-c500a2617944-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:13 crc kubenswrapper[4757]: I1216 13:08:13.753476 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d8647f88-tzcvz"] Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.494757 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32bc2074-3d53-44b4-8e6c-c500a2617944","Type":"ContainerDied","Data":"e1e423eeac2f066df36b8340d33357cb2f2d56161cc7f3cd572a6cef88dec90e"} Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.495141 4757 scope.go:117] "RemoveContainer" containerID="0ef145acfbfafa8bd433cce6a05108d27b07bea77164b78afc87588a8d6595b5" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.494831 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.545409 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f49b544b-vlhzn" event={"ID":"3687417c-3bcd-48ea-b328-440ff4005a02","Type":"ContainerStarted","Data":"f467372a9550884615b31602ab97a8c56899f33e3ebf192ae76e32005d8abdd0"} Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.545502 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.547338 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.558412 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d8647f88-tzcvz" event={"ID":"7702799e-4dc6-4a4d-b479-59cae8163e3c","Type":"ContainerStarted","Data":"a25c39c18ad1ad4f6143348071a05dddb88a90f7dd3e1e455dd0fa66b19a0b46"} Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.558459 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d8647f88-tzcvz" event={"ID":"7702799e-4dc6-4a4d-b479-59cae8163e3c","Type":"ContainerStarted","Data":"7f6fa46392ae864a1ac70b717574ffeb1219b90ef81f02999a71e33c58e04e1e"} Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.571646 4757 generic.go:334] "Generic (PLEG): container finished" podID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerID="f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67" exitCode=0 Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.571728 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" event={"ID":"5c88d603-6fdd-446a-a46c-990d30bacb6c","Type":"ContainerDied","Data":"f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67"} Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.608905 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.608955 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.655656 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.692105 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.715388 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:14 crc kubenswrapper[4757]: E1216 13:08:14.719059 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bc2074-3d53-44b4-8e6c-c500a2617944" containerName="sg-core" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.719098 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bc2074-3d53-44b4-8e6c-c500a2617944" containerName="sg-core" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.719339 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bc2074-3d53-44b4-8e6c-c500a2617944" containerName="sg-core" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.723733 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.741097 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.741355 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.818761 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57f49b544b-vlhzn" podStartSLOduration=12.818737644 podStartE2EDuration="12.818737644s" podCreationTimestamp="2025-12-16 13:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:14.635616591 +0000 UTC m=+1280.063360397" watchObservedRunningTime="2025-12-16 13:08:14.818737644 +0000 UTC m=+1280.246481440" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899258 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-log-httpd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899315 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28lmd\" (UniqueName: \"kubernetes.io/projected/d90963c3-d526-4ee8-a945-9c8cb8868a9a-kube-api-access-28lmd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899443 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899513 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-config-data\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899540 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-scripts\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899610 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:14 crc kubenswrapper[4757]: I1216 13:08:14.899631 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-run-httpd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " 
pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.000969 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.001067 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-config-data\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.001099 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-scripts\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.001567 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.001597 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-run-httpd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.001638 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-log-httpd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.001677 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28lmd\" (UniqueName: \"kubernetes.io/projected/d90963c3-d526-4ee8-a945-9c8cb8868a9a-kube-api-access-28lmd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.002614 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-run-httpd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.002992 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-log-httpd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.007263 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-scripts\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: 
I1216 13:08:15.013313 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.017830 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.022235 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podStartSLOduration=9.022211888 podStartE2EDuration="9.022211888s" podCreationTimestamp="2025-12-16 13:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:14.754951685 +0000 UTC m=+1280.182695501" watchObservedRunningTime="2025-12-16 13:08:15.022211888 +0000 UTC m=+1280.449955684" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.024191 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28lmd\" (UniqueName: \"kubernetes.io/projected/d90963c3-d526-4ee8-a945-9c8cb8868a9a-kube-api-access-28lmd\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.030750 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-config-data\") pod \"ceilometer-0\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.033324 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bc2074-3d53-44b4-8e6c-c500a2617944" path="/var/lib/kubelet/pods/32bc2074-3d53-44b4-8e6c-c500a2617944/volumes" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.033946 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.254463 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:15 crc kubenswrapper[4757]: E1216 13:08:15.482809 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod331a02af_2a7a_4a29_9709_ed1a45fa8046.slice/crio-012b674f667d34e578cfb7d859bb6dbdefd47e2362455c8492b4e95435ed8374.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod331a02af_2a7a_4a29_9709_ed1a45fa8046.slice/crio-conmon-012b674f667d34e578cfb7d859bb6dbdefd47e2362455c8492b4e95435ed8374.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.747812 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66bf65f8-2687-4d20-b491-7a90392a3587","Type":"ContainerStarted","Data":"38c1a3b8f7de40d01d604585c03558a3e667b153c0f361d45d34330a1222e48b"} Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.756628 4757 generic.go:334] "Generic (PLEG): container finished" podID="331a02af-2a7a-4a29-9709-ed1a45fa8046" containerID="012b674f667d34e578cfb7d859bb6dbdefd47e2362455c8492b4e95435ed8374" exitCode=0 Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.756726 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" event={"ID":"331a02af-2a7a-4a29-9709-ed1a45fa8046","Type":"ContainerDied","Data":"012b674f667d34e578cfb7d859bb6dbdefd47e2362455c8492b4e95435ed8374"} Dec 16 13:08:15 crc kubenswrapper[4757]: I1216 13:08:15.786867 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cfff6bfd-qz5sk" event={"ID":"fd29da8f-05a6-43a9-a943-c6a8a4ef8479","Type":"ContainerStarted","Data":"f49fad4dfc401c5629a41a9bed27cc1e9ca9eba0c580898f07d8f3eb2febb94c"} Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.190589 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.509843 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.598265 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-sb\") pod \"331a02af-2a7a-4a29-9709-ed1a45fa8046\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.598498 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-config\") pod \"331a02af-2a7a-4a29-9709-ed1a45fa8046\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.598592 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7qtr\" (UniqueName: \"kubernetes.io/projected/331a02af-2a7a-4a29-9709-ed1a45fa8046-kube-api-access-p7qtr\") pod \"331a02af-2a7a-4a29-9709-ed1a45fa8046\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.598707 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-svc\") pod \"331a02af-2a7a-4a29-9709-ed1a45fa8046\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.598770 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-nb\") pod \"331a02af-2a7a-4a29-9709-ed1a45fa8046\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.598834 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-swift-storage-0\") pod \"331a02af-2a7a-4a29-9709-ed1a45fa8046\" (UID: \"331a02af-2a7a-4a29-9709-ed1a45fa8046\") " Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.684272 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-config" (OuterVolumeSpecName: "config") pod "331a02af-2a7a-4a29-9709-ed1a45fa8046" (UID: "331a02af-2a7a-4a29-9709-ed1a45fa8046"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.704539 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.752385 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "331a02af-2a7a-4a29-9709-ed1a45fa8046" (UID: "331a02af-2a7a-4a29-9709-ed1a45fa8046"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.770269 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331a02af-2a7a-4a29-9709-ed1a45fa8046-kube-api-access-p7qtr" (OuterVolumeSpecName: "kube-api-access-p7qtr") pod "331a02af-2a7a-4a29-9709-ed1a45fa8046" (UID: "331a02af-2a7a-4a29-9709-ed1a45fa8046"). InnerVolumeSpecName "kube-api-access-p7qtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.786238 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "331a02af-2a7a-4a29-9709-ed1a45fa8046" (UID: "331a02af-2a7a-4a29-9709-ed1a45fa8046"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.795180 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "331a02af-2a7a-4a29-9709-ed1a45fa8046" (UID: "331a02af-2a7a-4a29-9709-ed1a45fa8046"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.807162 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.807203 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.807218 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.807230 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7qtr\" (UniqueName: \"kubernetes.io/projected/331a02af-2a7a-4a29-9709-ed1a45fa8046-kube-api-access-p7qtr\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.816154 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "331a02af-2a7a-4a29-9709-ed1a45fa8046" (UID: "331a02af-2a7a-4a29-9709-ed1a45fa8046"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.824894 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d8647f88-tzcvz" event={"ID":"7702799e-4dc6-4a4d-b479-59cae8163e3c","Type":"ContainerStarted","Data":"af24427995b417ea48bb519238512682f9cdda556b4a976136f4fb16060e66ba"} Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.828162 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.846415 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" event={"ID":"5c88d603-6fdd-446a-a46c-990d30bacb6c","Type":"ContainerStarted","Data":"94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31"} Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.846463 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.909027 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/331a02af-2a7a-4a29-9709-ed1a45fa8046-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.926041 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9d8647f88-tzcvz" podStartSLOduration=6.925912262 podStartE2EDuration="6.925912262s" podCreationTimestamp="2025-12-16 13:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:16.859036075 +0000 UTC m=+1282.286779891" watchObservedRunningTime="2025-12-16 13:08:16.925912262 +0000 UTC m=+1282.353656058" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.937141 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerStarted","Data":"c81d1e9b1e848b506ee4ee5a6675a1ccc27f74d8cf2dae592baa1d780ee7820b"} Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.940521 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" podStartSLOduration=6.940498536 podStartE2EDuration="6.940498536s" podCreationTimestamp="2025-12-16 13:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:16.917513956 +0000 UTC m=+1282.345257752" watchObservedRunningTime="2025-12-16 13:08:16.940498536 +0000 UTC m=+1282.368242332" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.940745 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.942249 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-7fn6m" event={"ID":"331a02af-2a7a-4a29-9709-ed1a45fa8046","Type":"ContainerDied","Data":"baa4f2073f78d300c89d110e28c0bce070f7449a8553d06ef12e4e09c5284e80"} Dec 16 13:08:16 crc kubenswrapper[4757]: I1216 13:08:16.942307 4757 scope.go:117] "RemoveContainer" containerID="012b674f667d34e578cfb7d859bb6dbdefd47e2362455c8492b4e95435ed8374" Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.121070 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-7fn6m"] Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.147576 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-7fn6m"] Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.950407 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4aae0516-59bd-448d-80be-c0df3dc002b9","Type":"ContainerStarted","Data":"8bae132a84fa051a54bfa956d7fd5105f138ee7d6175341fcc8c961ba95ccb30"} Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.982945 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f64c6bbf7-pnthz"] Dec 16 13:08:17 crc kubenswrapper[4757]: E1216 13:08:17.983692 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331a02af-2a7a-4a29-9709-ed1a45fa8046" containerName="init" Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.983795 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="331a02af-2a7a-4a29-9709-ed1a45fa8046" containerName="init" Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.984142 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="331a02af-2a7a-4a29-9709-ed1a45fa8046" containerName="init" Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.985712 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.988304 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 16 13:08:17 crc kubenswrapper[4757]: I1216 13:08:17.988562 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.001899 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f64c6bbf7-pnthz"] Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.056880 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-config\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.056983 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-httpd-config\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.057044 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-ovndb-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.057139 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpwk\" (UniqueName: \"kubernetes.io/projected/cac5be05-fb05-4246-86e5-2b8dbdbffd04-kube-api-access-dfpwk\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.057215 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-internal-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.057322 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-combined-ca-bundle\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.057392 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-public-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.158704 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-httpd-config\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.158773 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-ovndb-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.158885 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpwk\" (UniqueName: \"kubernetes.io/projected/cac5be05-fb05-4246-86e5-2b8dbdbffd04-kube-api-access-dfpwk\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.158934 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-internal-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.158982 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-combined-ca-bundle\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.159040 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-public-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.159107 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-config\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.165037 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-config\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.166600 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-internal-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.166691 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-public-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") 
" pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.166988 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-combined-ca-bundle\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.172509 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-httpd-config\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.172746 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cac5be05-fb05-4246-86e5-2b8dbdbffd04-ovndb-tls-certs\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.201860 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpwk\" (UniqueName: \"kubernetes.io/projected/cac5be05-fb05-4246-86e5-2b8dbdbffd04-kube-api-access-dfpwk\") pod \"neutron-7f64c6bbf7-pnthz\" (UID: \"cac5be05-fb05-4246-86e5-2b8dbdbffd04\") " pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.305828 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.963139 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331a02af-2a7a-4a29-9709-ed1a45fa8046" path="/var/lib/kubelet/pods/331a02af-2a7a-4a29-9709-ed1a45fa8046/volumes" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.969802 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66bf65f8-2687-4d20-b491-7a90392a3587","Type":"ContainerStarted","Data":"8443aba7a557a48c800d32d1f2508d8179591e6aeecd0822404cad2d6adafe50"} Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.970097 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api-log" containerID="cri-o://38c1a3b8f7de40d01d604585c03558a3e667b153c0f361d45d34330a1222e48b" gracePeriod=30 Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.970554 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 13:08:18 crc kubenswrapper[4757]: I1216 13:08:18.970709 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api" containerID="cri-o://8443aba7a557a48c800d32d1f2508d8179591e6aeecd0822404cad2d6adafe50" gracePeriod=30 Dec 16 13:08:19 crc kubenswrapper[4757]: I1216 13:08:19.066078 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.066054664 podStartE2EDuration="9.066054664s" podCreationTimestamp="2025-12-16 13:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
Dec 16 13:08:19 crc kubenswrapper[4757]: I1216 13:08:19.066078 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.066054664 podStartE2EDuration="9.066054664s" podCreationTimestamp="2025-12-16 13:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:19.045652311 +0000 UTC m=+1284.473396127" watchObservedRunningTime="2025-12-16 13:08:19.066054664 +0000 UTC m=+1284.493798460"
Dec 16 13:08:19 crc kubenswrapper[4757]: I1216 13:08:19.980814 4757 generic.go:334] "Generic (PLEG): container finished" podID="66bf65f8-2687-4d20-b491-7a90392a3587" containerID="8443aba7a557a48c800d32d1f2508d8179591e6aeecd0822404cad2d6adafe50" exitCode=0
Dec 16 13:08:19 crc kubenswrapper[4757]: I1216 13:08:19.981122 4757 generic.go:334] "Generic (PLEG): container finished" podID="66bf65f8-2687-4d20-b491-7a90392a3587" containerID="38c1a3b8f7de40d01d604585c03558a3e667b153c0f361d45d34330a1222e48b" exitCode=143
Dec 16 13:08:19 crc kubenswrapper[4757]: I1216 13:08:19.980898 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66bf65f8-2687-4d20-b491-7a90392a3587","Type":"ContainerDied","Data":"8443aba7a557a48c800d32d1f2508d8179591e6aeecd0822404cad2d6adafe50"}
Dec 16 13:08:19 crc kubenswrapper[4757]: I1216 13:08:19.981155 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66bf65f8-2687-4d20-b491-7a90392a3587","Type":"ContainerDied","Data":"38c1a3b8f7de40d01d604585c03558a3e667b153c0f361d45d34330a1222e48b"}
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.405348 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.523102 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8j9\" (UniqueName: \"kubernetes.io/projected/66bf65f8-2687-4d20-b491-7a90392a3587-kube-api-access-zk8j9\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") "
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.523177 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66bf65f8-2687-4d20-b491-7a90392a3587-logs\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") "
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.523197 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data-custom\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") "
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.523233 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-combined-ca-bundle\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") "
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.523250 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66bf65f8-2687-4d20-b491-7a90392a3587-etc-machine-id\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") "
Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.523276 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-scripts\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") "
Dec 16 13:08:20 crc 
kubenswrapper[4757]: I1216 13:08:20.523331 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data\") pod \"66bf65f8-2687-4d20-b491-7a90392a3587\" (UID: \"66bf65f8-2687-4d20-b491-7a90392a3587\") " Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.539176 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66bf65f8-2687-4d20-b491-7a90392a3587-logs" (OuterVolumeSpecName: "logs") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.539629 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66bf65f8-2687-4d20-b491-7a90392a3587-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.542237 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.542604 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66bf65f8-2687-4d20-b491-7a90392a3587-kube-api-access-zk8j9" (OuterVolumeSpecName: "kube-api-access-zk8j9") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "kube-api-access-zk8j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.550403 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-scripts" (OuterVolumeSpecName: "scripts") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.623202 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.625395 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8j9\" (UniqueName: \"kubernetes.io/projected/66bf65f8-2687-4d20-b491-7a90392a3587-kube-api-access-zk8j9\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.625495 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66bf65f8-2687-4d20-b491-7a90392a3587-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.625569 4757 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.625631 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.625685 4757 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66bf65f8-2687-4d20-b491-7a90392a3587-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.625735 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.653301 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data" (OuterVolumeSpecName: "config-data") pod "66bf65f8-2687-4d20-b491-7a90392a3587" (UID: "66bf65f8-2687-4d20-b491-7a90392a3587"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.731340 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66bf65f8-2687-4d20-b491-7a90392a3587-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:20 crc kubenswrapper[4757]: I1216 13:08:20.912094 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f64c6bbf7-pnthz"] Dec 16 13:08:20 crc kubenswrapper[4757]: W1216 13:08:20.989872 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac5be05_fb05_4246_86e5_2b8dbdbffd04.slice/crio-c634eee1620135c3f61e21b9e93789f3bfea88c54aedad75752a350eb0a444e9 WatchSource:0}: Error finding container c634eee1620135c3f61e21b9e93789f3bfea88c54aedad75752a350eb0a444e9: Status 404 returned error can't find the container with id c634eee1620135c3f61e21b9e93789f3bfea88c54aedad75752a350eb0a444e9 Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.065835 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.155701 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s2sxx"] Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.155996 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerName="dnsmasq-dns" containerID="cri-o://c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e" gracePeriod=10 Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.173744 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"66bf65f8-2687-4d20-b491-7a90392a3587","Type":"ContainerDied","Data":"e4ff5c5f7e62bbec4d3ba8b7f2d999e403b61c2328002e87f4cfce552ea10f47"} Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.173804 4757 scope.go:117] "RemoveContainer" containerID="8443aba7a557a48c800d32d1f2508d8179591e6aeecd0822404cad2d6adafe50" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.173955 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.183100 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.183153 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.183194 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.183868 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"feaab26a71eb3b6535920da4cbeacb812adf972fd9cda852a626b3b8fac4ff4e"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.184474 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://feaab26a71eb3b6535920da4cbeacb812adf972fd9cda852a626b3b8fac4ff4e" gracePeriod=600 Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.185228 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f64c6bbf7-pnthz" event={"ID":"cac5be05-fb05-4246-86e5-2b8dbdbffd04","Type":"ContainerStarted","Data":"c634eee1620135c3f61e21b9e93789f3bfea88c54aedad75752a350eb0a444e9"} Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.237459 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.242322 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" event={"ID":"25963bc5-afd1-4703-a583-df0d8094117d","Type":"ContainerStarted","Data":"4771565d45ceeb15cfca9880af112cc03fc054d8d9a1bcdc27e283c584d7b189"} Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.265824 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.307222 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:21 crc kubenswrapper[4757]: E1216 13:08:21.308082 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.308106 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api" Dec 16 13:08:21 crc kubenswrapper[4757]: E1216 13:08:21.308146 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api-log" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 
13:08:21.308155 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api-log" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.308387 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api-log" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.308428 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" containerName="cinder-api" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.310694 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.319485 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.319692 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.319839 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.351932 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.392119 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.370237 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx9dj\" (UniqueName: \"kubernetes.io/projected/e2b303e0-e076-4589-9fb3-b51f998a293e-kube-api-access-nx9dj\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.464383 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2b303e0-e076-4589-9fb3-b51f998a293e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.464659 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-config-data\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.464813 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2b303e0-e076-4589-9fb3-b51f998a293e-logs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.464984 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.465171 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-scripts\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.465349 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.465567 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.456968 4757 scope.go:117] "RemoveContainer" containerID="38c1a3b8f7de40d01d604585c03558a3e667b153c0f361d45d34330a1222e48b" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.474773 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.479701 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.480772 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"25ab2be0f535088a98cd5974a570660c2b1ab7f032761874bdf1659a40210f03"} pod="openstack/horizon-75ccc7d896-jmrk9" containerMessage="Container horizon failed startup probe, will be restarted" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.490041 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" containerID="cri-o://25ab2be0f535088a98cd5974a570660c2b1ab7f032761874bdf1659a40210f03" gracePeriod=30 Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581475 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2b303e0-e076-4589-9fb3-b51f998a293e-logs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581538 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581575 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-scripts\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581659 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581710 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581749 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581792 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx9dj\" (UniqueName: \"kubernetes.io/projected/e2b303e0-e076-4589-9fb3-b51f998a293e-kube-api-access-nx9dj\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.581822 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2b303e0-e076-4589-9fb3-b51f998a293e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.584830 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2b303e0-e076-4589-9fb3-b51f998a293e-logs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.589610 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.589677 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.601489 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-config-data\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.619183 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"357eee533356136a47d50ddfe4b10cb06996ef66793b34da2123f1bd22018055"} pod="openstack/horizon-5d66ddf65b-lmltr" 
containerMessage="Container horizon failed startup probe, will be restarted" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.619257 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" containerID="cri-o://357eee533356136a47d50ddfe4b10cb06996ef66793b34da2123f1bd22018055" gracePeriod=30 Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.619857 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-scripts\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.619430 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2b303e0-e076-4589-9fb3-b51f998a293e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.620927 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.643458 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.650314 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-config-data\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.656101 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx9dj\" (UniqueName: \"kubernetes.io/projected/e2b303e0-e076-4589-9fb3-b51f998a293e-kube-api-access-nx9dj\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.658896 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.669094 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2b303e0-e076-4589-9fb3-b51f998a293e-config-data-custom\") pod \"cinder-api-0\" (UID: \"e2b303e0-e076-4589-9fb3-b51f998a293e\") " pod="openstack/cinder-api-0" Dec 16 13:08:21 crc kubenswrapper[4757]: I1216 13:08:21.763598 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.116206 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.231019 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-swift-storage-0\") pod \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.231080 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-sb\") pod \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.231298 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnhh\" (UniqueName: \"kubernetes.io/projected/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-kube-api-access-cgnhh\") pod \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.231343 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-config\") pod \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.231377 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-svc\") pod \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.231437 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-nb\") pod \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\" (UID: \"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee\") " Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.255608 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-kube-api-access-cgnhh" (OuterVolumeSpecName: "kube-api-access-cgnhh") pod "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" (UID: "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee"). InnerVolumeSpecName "kube-api-access-cgnhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.334273 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnhh\" (UniqueName: \"kubernetes.io/projected/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-kube-api-access-cgnhh\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.354900 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" event={"ID":"25963bc5-afd1-4703-a583-df0d8094117d","Type":"ContainerStarted","Data":"b50e1fb4bb9c35315781549991e5a9a2ccccb7506442f1ae0d2146a9066b3123"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.368570 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9bd99879-lw2rw" event={"ID":"7d1df7bf-6c39-4e49-873a-701b8c05f900","Type":"ContainerStarted","Data":"276fc9090503a9ada492fedaa8a1ff2ddc23b7a0498f992b1e5c559caa0fdcb8"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.381496 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4aae0516-59bd-448d-80be-c0df3dc002b9","Type":"ContainerStarted","Data":"dc8d830b3117d76d282e99f26579e97228d766095d176f87980a668cca428d15"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.386983 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerStarted","Data":"af86fd7066c4336b80c6968c3fc8b3201c18d818a5c052797bf3272f6b25c492"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.405103 4757 generic.go:334] "Generic (PLEG): container finished" podID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerID="c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e" exitCode=0 Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.405372 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" event={"ID":"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee","Type":"ContainerDied","Data":"c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.405479 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" event={"ID":"d708ede9-7a1e-4baa-9c7a-21bb0007c6ee","Type":"ContainerDied","Data":"a19d2e5427d2d6b4409ce88916f6139b60e936a6207597a096d0cdb246ffea98"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.405564 4757 scope.go:117] "RemoveContainer" containerID="c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.405741 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s2sxx" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.407473 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d9fbfd57d-79dn2" podStartSLOduration=13.159183516 podStartE2EDuration="21.407450869s" podCreationTimestamp="2025-12-16 13:08:01 +0000 UTC" firstStartedPulling="2025-12-16 13:08:11.825842158 +0000 UTC m=+1277.253585954" lastFinishedPulling="2025-12-16 13:08:20.074109511 +0000 UTC m=+1285.501853307" observedRunningTime="2025-12-16 13:08:22.387585657 +0000 UTC m=+1287.815329453" watchObservedRunningTime="2025-12-16 13:08:22.407450869 +0000 UTC m=+1287.835194665" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.453490 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.464929185999999 podStartE2EDuration="13.453469483s" podCreationTimestamp="2025-12-16 13:08:09 +0000 UTC" firstStartedPulling="2025-12-16 13:08:13.351821753 +0000 UTC m=+1278.779565549" lastFinishedPulling="2025-12-16 13:08:16.34036205 +0000 UTC m=+1281.768105846" observedRunningTime="2025-12-16 13:08:22.418260489 +0000 UTC m=+1287.846004285" watchObservedRunningTime="2025-12-16 13:08:22.453469483 +0000 UTC m=+1287.881213279" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.463170 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="feaab26a71eb3b6535920da4cbeacb812adf972fd9cda852a626b3b8fac4ff4e" exitCode=0 Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.463475 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"feaab26a71eb3b6535920da4cbeacb812adf972fd9cda852a626b3b8fac4ff4e"} Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.486217 4757 scope.go:117] "RemoveContainer" containerID="d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.600096 4757 scope.go:117] "RemoveContainer" containerID="c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e" Dec 16 13:08:22 crc kubenswrapper[4757]: E1216 13:08:22.600991 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e\": container with ID starting with c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e not found: ID does not exist" containerID="c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.601050 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e"} err="failed to get container status \"c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e\": rpc error: code = NotFound desc = could not find container \"c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e\": container with ID starting with c69fbe380803057a8c809584828b3c2d68ee3a7ccfd071848ee98903f953142e not found: ID does not exist" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.601078 4757 scope.go:117] "RemoveContainer" containerID="d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a" Dec 16 13:08:22 crc 
kubenswrapper[4757]: E1216 13:08:22.612794 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a\": container with ID starting with d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a not found: ID does not exist" containerID="d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.612844 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a"} err="failed to get container status \"d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a\": rpc error: code = NotFound desc = could not find container \"d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a\": container with ID starting with d4f19d898e28c9d3c23a326aa417713f08fd2b11f12443eae761937505148f1a not found: ID does not exist" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.612870 4757 scope.go:117] "RemoveContainer" containerID="a3d5810574004acc14ba78a28c621226c4b91fbf94cdc448f59cddf05b4cabae" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.674244 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" (UID: "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.726271 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" (UID: "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.726814 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" (UID: "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee"). InnerVolumeSpecName "dns-swift-storage-0". 
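
The "ContainerStatus from runtime service failed ... NotFound" errors above are harmless noise: while cleaning up, the kubelet asks CRI-O for the status of a container that has already been removed, gets gRPC code NotFound, logs it, and moves on. One common way to tolerate that, sketched with grpc-go's status package (removeIfPresent is an invented helper, not the kubelet's exact code):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent treats gRPC NotFound as success, which makes
// deletion idempotent -- the pattern behind the benign errors above.
func removeIfPresent(remove func(id string) error, id string) error {
	if err := remove(id); status.Code(err) != codes.NotFound {
		return err // nil on success, or a real failure to surface
	}
	return nil // already gone; nothing to do
}

func main() {
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeIfPresent(gone, "c69fbe38...")) // <nil>
}
```
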
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.751517 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.751540 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.751551 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.796694 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" (UID: "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.805972 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-config" (OuterVolumeSpecName: "config") pod "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" (UID: "d708ede9-7a1e-4baa-9c7a-21bb0007c6ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.865374 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.865433 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.940911 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 13:08:22 crc kubenswrapper[4757]: I1216 13:08:22.993163 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66bf65f8-2687-4d20-b491-7a90392a3587" path="/var/lib/kubelet/pods/66bf65f8-2687-4d20-b491-7a90392a3587/volumes" Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.148966 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s2sxx"] Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.159842 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s2sxx"] Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.515722 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"} Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.522580 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f64c6bbf7-pnthz" 
event={"ID":"cac5be05-fb05-4246-86e5-2b8dbdbffd04","Type":"ContainerStarted","Data":"1cd8ed7937d830ede8d7155c1487cbb99dae195760eb8c9562dc5f8205a3ac76"} Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.522624 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f64c6bbf7-pnthz" event={"ID":"cac5be05-fb05-4246-86e5-2b8dbdbffd04","Type":"ContainerStarted","Data":"617ee8c7363bb6307047560a02eb3b530e6bbc6ea1729c9016946000bbbc4407"} Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.523865 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.529454 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2b303e0-e076-4589-9fb3-b51f998a293e","Type":"ContainerStarted","Data":"3bb81995bc347d68533e97252a2cc33e2f74bb8c80dcf83e7cd0f58f8468268e"} Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.547415 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-9bd99879-lw2rw" event={"ID":"7d1df7bf-6c39-4e49-873a-701b8c05f900","Type":"ContainerStarted","Data":"c08b5457eb2828240cb156856c1fed7fcdfe0324293f4360ffe268baf4004b8c"} Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.555327 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerStarted","Data":"babaa5b68cae83e0260090338c0609eea17694a25b5ffba7ea5f1fc381da24dd"} Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.572841 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f64c6bbf7-pnthz" podStartSLOduration=6.572818684 podStartE2EDuration="6.572818684s" podCreationTimestamp="2025-12-16 13:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:23.551149933 +0000 UTC m=+1288.978893739" watchObservedRunningTime="2025-12-16 13:08:23.572818684 +0000 UTC m=+1289.000562480" Dec 16 13:08:23 crc kubenswrapper[4757]: I1216 13:08:23.597506 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-9bd99879-lw2rw" podStartSLOduration=15.378851587 podStartE2EDuration="22.597482703s" podCreationTimestamp="2025-12-16 13:08:01 +0000 UTC" firstStartedPulling="2025-12-16 13:08:12.919641602 +0000 UTC m=+1278.347385398" lastFinishedPulling="2025-12-16 13:08:20.138272718 +0000 UTC m=+1285.566016514" observedRunningTime="2025-12-16 13:08:23.571620067 +0000 UTC m=+1288.999363863" watchObservedRunningTime="2025-12-16 13:08:23.597482703 +0000 UTC m=+1289.025226499" Dec 16 13:08:24 crc kubenswrapper[4757]: I1216 13:08:24.375277 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:24 crc kubenswrapper[4757]: I1216 13:08:24.385574 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:24 
crc kubenswrapper[4757]: I1216 13:08:24.573375 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2b303e0-e076-4589-9fb3-b51f998a293e","Type":"ContainerStarted","Data":"9e835d62fabb5aa9965dac0b237268d5f1fb69ff7e7b522b9ffad8792d5c35cc"} Dec 16 13:08:24 crc kubenswrapper[4757]: I1216 13:08:24.994904 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" path="/var/lib/kubelet/pods/d708ede9-7a1e-4baa-9c7a-21bb0007c6ee/volumes" Dec 16 13:08:25 crc kubenswrapper[4757]: I1216 13:08:25.424234 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 13:08:25 crc kubenswrapper[4757]: I1216 13:08:25.426590 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.158:8080/\": dial tcp 10.217.0.158:8080: connect: connection refused" Dec 16 13:08:25 crc kubenswrapper[4757]: I1216 13:08:25.604862 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerStarted","Data":"d728a1c8748643f0317fc1a8b2031e352883effaf8b7604f5eda0cc1aa8a8ed4"} Dec 16 13:08:25 crc kubenswrapper[4757]: I1216 13:08:25.605366 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:25 crc kubenswrapper[4757]: I1216 13:08:25.605627 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:26 crc kubenswrapper[4757]: I1216 13:08:26.622519 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e2b303e0-e076-4589-9fb3-b51f998a293e","Type":"ContainerStarted","Data":"58ec5a2cf24f2b2c7d005e884f2adf24beecacbfa89f8b470c37ceb1f9bb845e"} Dec 16 13:08:26 crc kubenswrapper[4757]: I1216 13:08:26.624104 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 13:08:26 crc kubenswrapper[4757]: I1216 13:08:26.643919 4757 scope.go:117] "RemoveContainer" containerID="7b51940f94242a597be3710ae28c955c6f0365a83cf3f9f61b8c9c5028a38820" Dec 16 13:08:26 crc kubenswrapper[4757]: I1216 13:08:26.645569 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.6455536649999996 podStartE2EDuration="5.645553665s" podCreationTimestamp="2025-12-16 13:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:26.64538127 +0000 UTC m=+1292.073125066" watchObservedRunningTime="2025-12-16 13:08:26.645553665 +0000 UTC m=+1292.073297461" Dec 16 13:08:26 crc kubenswrapper[4757]: I1216 13:08:26.696213 4757 scope.go:117] "RemoveContainer" containerID="b315292e8173318c3d3f990fada0b084b2534996da01a53e12df29a7b5887d85" Dec 16 13:08:26 crc 
kubenswrapper[4757]: I1216 13:08:26.783581 4757 scope.go:117] "RemoveContainer" containerID="8e24f74c89df2d4e8232014d000906bca61c7c009e60bd5840003f72005545df" Dec 16 13:08:27 crc kubenswrapper[4757]: I1216 13:08:27.371252 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:27 crc kubenswrapper[4757]: I1216 13:08:27.371281 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:27 crc kubenswrapper[4757]: I1216 13:08:27.603556 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:27 crc kubenswrapper[4757]: I1216 13:08:27.603816 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:27 crc kubenswrapper[4757]: I1216 13:08:27.637277 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerStarted","Data":"12223a0b7ba10d366b1b7eed777f9063629bbd4d6fc3eb3f90643d1c745b2103"} Dec 16 13:08:27 crc kubenswrapper[4757]: I1216 13:08:27.637313 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 13:08:29 crc kubenswrapper[4757]: I1216 13:08:29.385240 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:29 crc kubenswrapper[4757]: I1216 13:08:29.391274 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:30 crc kubenswrapper[4757]: I1216 13:08:30.424970 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.158:8080/\": dial tcp 10.217.0.158:8080: connect: connection refused" Dec 16 13:08:30 crc kubenswrapper[4757]: I1216 13:08:30.689283 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:30 crc kubenswrapper[4757]: I1216 13:08:30.689300 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.740567 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.766712 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.812393107 podStartE2EDuration="17.766683377s" podCreationTimestamp="2025-12-16 13:08:14 +0000 UTC" firstStartedPulling="2025-12-16 13:08:16.312847599 +0000 UTC m=+1281.740591395" lastFinishedPulling="2025-12-16 13:08:26.267137869 +0000 UTC m=+1291.694881665" observedRunningTime="2025-12-16 13:08:27.665781092 +0000 UTC m=+1293.093524898" watchObservedRunningTime="2025-12-16 13:08:31.766683377 +0000 UTC m=+1297.194427173" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.809632 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-66866d5f44-2mhtb" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.996429 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:31 crc kubenswrapper[4757]: E1216 13:08:31.996849 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerName="dnsmasq-dns" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.996861 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerName="dnsmasq-dns" Dec 16 13:08:31 crc kubenswrapper[4757]: E1216 13:08:31.996873 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerName="init" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.996879 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerName="init" Dec 16 13:08:31 crc kubenswrapper[4757]: I1216 13:08:31.997062 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d708ede9-7a1e-4baa-9c7a-21bb0007c6ee" containerName="dnsmasq-dns" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.028731 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.028846 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.042450 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.042706 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8mvgp" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.042966 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.133156 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.133276 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vr5n\" (UniqueName: \"kubernetes.io/projected/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-kube-api-access-9vr5n\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.133305 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.133344 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.235253 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vr5n\" (UniqueName: \"kubernetes.io/projected/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-kube-api-access-9vr5n\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.235295 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.235322 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.235402 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-combined-ca-bundle\") pod \"openstackclient\" 
(UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.236297 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.256964 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.257450 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.263450 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vr5n\" (UniqueName: \"kubernetes.io/projected/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-kube-api-access-9vr5n\") pod \"openstackclient\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.328914 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.329416 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.347921 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.382282 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.387299 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.388973 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cfff6bfd-qz5sk" podUID="fd29da8f-05a6-43a9-a943-c6a8a4ef8479" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.389967 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.439681 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s65jt\" (UniqueName: \"kubernetes.io/projected/891886a7-6bbd-48b7-8460-a1467bae862a-kube-api-access-s65jt\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.439772 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/891886a7-6bbd-48b7-8460-a1467bae862a-openstack-config\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.439834 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891886a7-6bbd-48b7-8460-a1467bae862a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.439915 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/891886a7-6bbd-48b7-8460-a1467bae862a-openstack-config-secret\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.448092 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.454437 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.542248 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891886a7-6bbd-48b7-8460-a1467bae862a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.542645 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/891886a7-6bbd-48b7-8460-a1467bae862a-openstack-config-secret\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.544299 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s65jt\" (UniqueName: \"kubernetes.io/projected/891886a7-6bbd-48b7-8460-a1467bae862a-kube-api-access-s65jt\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.544364 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/891886a7-6bbd-48b7-8460-a1467bae862a-openstack-config\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.545254 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/891886a7-6bbd-48b7-8460-a1467bae862a-openstack-config\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.547284 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891886a7-6bbd-48b7-8460-a1467bae862a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.574838 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cfff6bfd-qz5sk" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.589459 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/891886a7-6bbd-48b7-8460-a1467bae862a-openstack-config-secret\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.592671 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s65jt\" (UniqueName: \"kubernetes.io/projected/891886a7-6bbd-48b7-8460-a1467bae862a-kube-api-access-s65jt\") pod \"openstackclient\" (UID: \"891886a7-6bbd-48b7-8460-a1467bae862a\") " pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.653705 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:32 crc kubenswrapper[4757]: E1216 13:08:32.672498 4757 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 16 13:08:32 crc kubenswrapper[4757]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_b8f5eee5-2fd8-46d5-8c00-308d34a0828d_0(d59aa82184d4862b00f40bba5097c3b72253f552c5f03423f04bd71d1f3e1710): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d59aa82184d4862b00f40bba5097c3b72253f552c5f03423f04bd71d1f3e1710" Netns:"/var/run/netns/f61b5f25-7561-412a-be12-394af22b480d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=d59aa82184d4862b00f40bba5097c3b72253f552c5f03423f04bd71d1f3e1710;K8S_POD_UID=b8f5eee5-2fd8-46d5-8c00-308d34a0828d" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/b8f5eee5-2fd8-46d5-8c00-308d34a0828d]: expected pod UID "b8f5eee5-2fd8-46d5-8c00-308d34a0828d" but got "891886a7-6bbd-48b7-8460-a1467bae862a" from Kube API Dec 16 13:08:32 crc kubenswrapper[4757]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 16 13:08:32 crc kubenswrapper[4757]: > Dec 16 13:08:32 crc kubenswrapper[4757]: E1216 13:08:32.672779 4757 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 16 13:08:32 crc kubenswrapper[4757]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_b8f5eee5-2fd8-46d5-8c00-308d34a0828d_0(d59aa82184d4862b00f40bba5097c3b72253f552c5f03423f04bd71d1f3e1710): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d59aa82184d4862b00f40bba5097c3b72253f552c5f03423f04bd71d1f3e1710" Netns:"/var/run/netns/f61b5f25-7561-412a-be12-394af22b480d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=d59aa82184d4862b00f40bba5097c3b72253f552c5f03423f04bd71d1f3e1710;K8S_POD_UID=b8f5eee5-2fd8-46d5-8c00-308d34a0828d" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/b8f5eee5-2fd8-46d5-8c00-308d34a0828d]: expected pod UID "b8f5eee5-2fd8-46d5-8c00-308d34a0828d" but got "891886a7-6bbd-48b7-8460-a1467bae862a" from Kube API Dec 16 13:08:32 crc kubenswrapper[4757]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 16 13:08:32 crc kubenswrapper[4757]: > pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.679678 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57f49b544b-vlhzn"] Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.682717 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.683917 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.684164 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" containerID="cri-o://3326627538ad35c4498b98fc3e33b430703af1aaeda5cdcbbcdc565d73fdc2e3" gracePeriod=30 Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.685319 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" containerID="cri-o://f467372a9550884615b31602ab97a8c56899f33e3ebf192ae76e32005d8abdd0" gracePeriod=30 Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.713548 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.747899 4757 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b8f5eee5-2fd8-46d5-8c00-308d34a0828d" podUID="891886a7-6bbd-48b7-8460-a1467bae862a" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.767388 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": EOF" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.772234 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.821655 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.870930 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config-secret\") pod \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.871361 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vr5n\" (UniqueName: \"kubernetes.io/projected/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-kube-api-access-9vr5n\") pod \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.871689 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config\") pod \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.871931 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-combined-ca-bundle\") pod \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\" (UID: \"b8f5eee5-2fd8-46d5-8c00-308d34a0828d\") " Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.884890 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b8f5eee5-2fd8-46d5-8c00-308d34a0828d" (UID: "b8f5eee5-2fd8-46d5-8c00-308d34a0828d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.895914 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8f5eee5-2fd8-46d5-8c00-308d34a0828d" (UID: "b8f5eee5-2fd8-46d5-8c00-308d34a0828d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.916455 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-kube-api-access-9vr5n" (OuterVolumeSpecName: "kube-api-access-9vr5n") pod "b8f5eee5-2fd8-46d5-8c00-308d34a0828d" (UID: "b8f5eee5-2fd8-46d5-8c00-308d34a0828d"). InnerVolumeSpecName "kube-api-access-9vr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.943207 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b8f5eee5-2fd8-46d5-8c00-308d34a0828d" (UID: "b8f5eee5-2fd8-46d5-8c00-308d34a0828d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.975287 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.975325 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vr5n\" (UniqueName: \"kubernetes.io/projected/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-kube-api-access-9vr5n\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.975335 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.975344 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f5eee5-2fd8-46d5-8c00-308d34a0828d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:32 crc kubenswrapper[4757]: I1216 13:08:32.978376 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f5eee5-2fd8-46d5-8c00-308d34a0828d" path="/var/lib/kubelet/pods/b8f5eee5-2fd8-46d5-8c00-308d34a0828d/volumes" Dec 16 13:08:33 crc kubenswrapper[4757]: I1216 13:08:33.536535 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 13:08:33 crc kubenswrapper[4757]: I1216 13:08:33.693599 4757 generic.go:334] "Generic (PLEG): container finished" podID="3687417c-3bcd-48ea-b328-440ff4005a02" containerID="3326627538ad35c4498b98fc3e33b430703af1aaeda5cdcbbcdc565d73fdc2e3" exitCode=143 Dec 16 13:08:33 crc kubenswrapper[4757]: I1216 13:08:33.693759 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f49b544b-vlhzn" event={"ID":"3687417c-3bcd-48ea-b328-440ff4005a02","Type":"ContainerDied","Data":"3326627538ad35c4498b98fc3e33b430703af1aaeda5cdcbbcdc565d73fdc2e3"} Dec 16 13:08:33 crc kubenswrapper[4757]: I1216 13:08:33.695536 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 13:08:33 crc kubenswrapper[4757]: I1216 13:08:33.695740 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"891886a7-6bbd-48b7-8460-a1467bae862a","Type":"ContainerStarted","Data":"ef1103e09587f671136f7c5b315d3fcfacbdc7b93457dee483588b7fbbad7b0b"} Dec 16 13:08:33 crc kubenswrapper[4757]: I1216 13:08:33.705189 4757 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b8f5eee5-2fd8-46d5-8c00-308d34a0828d" podUID="891886a7-6bbd-48b7-8460-a1467bae862a" Dec 16 13:08:34 crc kubenswrapper[4757]: I1216 13:08:34.129418 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5494d9c5f6-8dwpv" Dec 16 13:08:34 crc kubenswrapper[4757]: I1216 13:08:34.213597 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5494d9c5f6-8dwpv" Dec 16 13:08:35 crc kubenswrapper[4757]: I1216 13:08:35.773904 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="e2b303e0-e076-4589-9fb3-b51f998a293e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:36 crc kubenswrapper[4757]: I1216 13:08:36.016792 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 13:08:36 crc kubenswrapper[4757]: I1216 13:08:36.089320 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:36 crc kubenswrapper[4757]: I1216 13:08:36.720272 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="cinder-scheduler" containerID="cri-o://8bae132a84fa051a54bfa956d7fd5105f138ee7d6175341fcc8c961ba95ccb30" gracePeriod=30 Dec 16 13:08:36 crc kubenswrapper[4757]: I1216 13:08:36.720338 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="probe" containerID="cri-o://dc8d830b3117d76d282e99f26579e97228d766095d176f87980a668cca428d15" gracePeriod=30 Dec 16 13:08:36 crc kubenswrapper[4757]: I1216 13:08:36.774262 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="e2b303e0-e076-4589-9fb3-b51f998a293e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:38 crc kubenswrapper[4757]: I1216 13:08:38.293817 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:36214->10.217.0.156:9311: read: connection reset by peer" Dec 16 13:08:38 crc kubenswrapper[4757]: I1216 13:08:38.293906 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57f49b544b-vlhzn" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:36216->10.217.0.156:9311: read: connection 
reset by peer" Dec 16 13:08:39 crc kubenswrapper[4757]: I1216 13:08:39.806324 4757 generic.go:334] "Generic (PLEG): container finished" podID="3687417c-3bcd-48ea-b328-440ff4005a02" containerID="f467372a9550884615b31602ab97a8c56899f33e3ebf192ae76e32005d8abdd0" exitCode=0 Dec 16 13:08:39 crc kubenswrapper[4757]: I1216 13:08:39.806747 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f49b544b-vlhzn" event={"ID":"3687417c-3bcd-48ea-b328-440ff4005a02","Type":"ContainerDied","Data":"f467372a9550884615b31602ab97a8c56899f33e3ebf192ae76e32005d8abdd0"} Dec 16 13:08:39 crc kubenswrapper[4757]: I1216 13:08:39.816410 4757 generic.go:334] "Generic (PLEG): container finished" podID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerID="dc8d830b3117d76d282e99f26579e97228d766095d176f87980a668cca428d15" exitCode=0 Dec 16 13:08:39 crc kubenswrapper[4757]: I1216 13:08:39.816470 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4aae0516-59bd-448d-80be-c0df3dc002b9","Type":"ContainerDied","Data":"dc8d830b3117d76d282e99f26579e97228d766095d176f87980a668cca428d15"} Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.095095 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.266840 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2c2z\" (UniqueName: \"kubernetes.io/projected/3687417c-3bcd-48ea-b328-440ff4005a02-kube-api-access-n2c2z\") pod \"3687417c-3bcd-48ea-b328-440ff4005a02\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.267240 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-combined-ca-bundle\") pod \"3687417c-3bcd-48ea-b328-440ff4005a02\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.267368 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data-custom\") pod \"3687417c-3bcd-48ea-b328-440ff4005a02\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.267473 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687417c-3bcd-48ea-b328-440ff4005a02-logs\") pod \"3687417c-3bcd-48ea-b328-440ff4005a02\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.267643 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data\") pod \"3687417c-3bcd-48ea-b328-440ff4005a02\" (UID: \"3687417c-3bcd-48ea-b328-440ff4005a02\") " Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.268426 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3687417c-3bcd-48ea-b328-440ff4005a02-logs" (OuterVolumeSpecName: "logs") pod "3687417c-3bcd-48ea-b328-440ff4005a02" (UID: "3687417c-3bcd-48ea-b328-440ff4005a02"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.278501 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3687417c-3bcd-48ea-b328-440ff4005a02-kube-api-access-n2c2z" (OuterVolumeSpecName: "kube-api-access-n2c2z") pod "3687417c-3bcd-48ea-b328-440ff4005a02" (UID: "3687417c-3bcd-48ea-b328-440ff4005a02"). InnerVolumeSpecName "kube-api-access-n2c2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.296876 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3687417c-3bcd-48ea-b328-440ff4005a02" (UID: "3687417c-3bcd-48ea-b328-440ff4005a02"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.376223 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2c2z\" (UniqueName: \"kubernetes.io/projected/3687417c-3bcd-48ea-b328-440ff4005a02-kube-api-access-n2c2z\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.376261 4757 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.376270 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687417c-3bcd-48ea-b328-440ff4005a02-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.457806 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data" (OuterVolumeSpecName: "config-data") pod "3687417c-3bcd-48ea-b328-440ff4005a02" (UID: "3687417c-3bcd-48ea-b328-440ff4005a02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.482085 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.490825 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3687417c-3bcd-48ea-b328-440ff4005a02" (UID: "3687417c-3bcd-48ea-b328-440ff4005a02"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.590142 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687417c-3bcd-48ea-b328-440ff4005a02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.868763 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57f49b544b-vlhzn" event={"ID":"3687417c-3bcd-48ea-b328-440ff4005a02","Type":"ContainerDied","Data":"6e3528c57c1a61809818466ba59afca126ee19375132550b8ac3c4748b35c287"} Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.869122 4757 scope.go:117] "RemoveContainer" containerID="f467372a9550884615b31602ab97a8c56899f33e3ebf192ae76e32005d8abdd0" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.869251 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57f49b544b-vlhzn" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.896268 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.898849 4757 generic.go:334] "Generic (PLEG): container finished" podID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerID="8bae132a84fa051a54bfa956d7fd5105f138ee7d6175341fcc8c961ba95ccb30" exitCode=0 Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.898891 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4aae0516-59bd-448d-80be-c0df3dc002b9","Type":"ContainerDied","Data":"8bae132a84fa051a54bfa956d7fd5105f138ee7d6175341fcc8c961ba95ccb30"} Dec 16 13:08:40 crc kubenswrapper[4757]: I1216 13:08:40.981881 4757 scope.go:117] "RemoveContainer" containerID="3326627538ad35c4498b98fc3e33b430703af1aaeda5cdcbbcdc565d73fdc2e3" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.005643 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data-custom\") pod \"4aae0516-59bd-448d-80be-c0df3dc002b9\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.005715 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-scripts\") pod \"4aae0516-59bd-448d-80be-c0df3dc002b9\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.005864 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-combined-ca-bundle\") pod \"4aae0516-59bd-448d-80be-c0df3dc002b9\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.005958 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data\") pod \"4aae0516-59bd-448d-80be-c0df3dc002b9\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.006017 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4aae0516-59bd-448d-80be-c0df3dc002b9-etc-machine-id\") pod \"4aae0516-59bd-448d-80be-c0df3dc002b9\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.006110 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66vsj\" (UniqueName: \"kubernetes.io/projected/4aae0516-59bd-448d-80be-c0df3dc002b9-kube-api-access-66vsj\") pod \"4aae0516-59bd-448d-80be-c0df3dc002b9\" (UID: \"4aae0516-59bd-448d-80be-c0df3dc002b9\") " Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.015385 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4aae0516-59bd-448d-80be-c0df3dc002b9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4aae0516-59bd-448d-80be-c0df3dc002b9" (UID: "4aae0516-59bd-448d-80be-c0df3dc002b9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.027439 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4aae0516-59bd-448d-80be-c0df3dc002b9" (UID: "4aae0516-59bd-448d-80be-c0df3dc002b9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.031500 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aae0516-59bd-448d-80be-c0df3dc002b9-kube-api-access-66vsj" (OuterVolumeSpecName: "kube-api-access-66vsj") pod "4aae0516-59bd-448d-80be-c0df3dc002b9" (UID: "4aae0516-59bd-448d-80be-c0df3dc002b9"). InnerVolumeSpecName "kube-api-access-66vsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.041280 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-scripts" (OuterVolumeSpecName: "scripts") pod "4aae0516-59bd-448d-80be-c0df3dc002b9" (UID: "4aae0516-59bd-448d-80be-c0df3dc002b9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.082770 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57f49b544b-vlhzn"] Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.082800 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-57f49b544b-vlhzn"] Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.109585 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.109621 4757 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4aae0516-59bd-448d-80be-c0df3dc002b9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.109635 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66vsj\" (UniqueName: \"kubernetes.io/projected/4aae0516-59bd-448d-80be-c0df3dc002b9-kube-api-access-66vsj\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.109649 4757 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.170347 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4aae0516-59bd-448d-80be-c0df3dc002b9" (UID: "4aae0516-59bd-448d-80be-c0df3dc002b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.211595 4757 scope.go:117] "RemoveContainer" containerID="dc8d830b3117d76d282e99f26579e97228d766095d176f87980a668cca428d15" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.212828 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.239180 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data" (OuterVolumeSpecName: "config-data") pod "4aae0516-59bd-448d-80be-c0df3dc002b9" (UID: "4aae0516-59bd-448d-80be-c0df3dc002b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.275301 4757 scope.go:117] "RemoveContainer" containerID="8bae132a84fa051a54bfa956d7fd5105f138ee7d6175341fcc8c961ba95ccb30" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.315191 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aae0516-59bd-448d-80be-c0df3dc002b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.780251 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="e2b303e0-e076-4589-9fb3-b51f998a293e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.815063 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.927072 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4aae0516-59bd-448d-80be-c0df3dc002b9","Type":"ContainerDied","Data":"e577e5b2021d61edc71a91c4a143880aa7d69a8e06345475deae8f37aa3ad810"} Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.927170 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.965930 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:41 crc kubenswrapper[4757]: I1216 13:08:41.994481 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.034501 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.060699 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:42 crc kubenswrapper[4757]: E1216 13:08:42.061193 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="cinder-scheduler" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061221 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="cinder-scheduler" Dec 16 13:08:42 crc kubenswrapper[4757]: E1216 13:08:42.061236 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="probe" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061244 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="probe" Dec 16 13:08:42 crc kubenswrapper[4757]: E1216 13:08:42.061273 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061281 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" Dec 16 13:08:42 crc kubenswrapper[4757]: E1216 13:08:42.061298 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" Dec 16 13:08:42 crc 
kubenswrapper[4757]: I1216 13:08:42.061307 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061542 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api-log" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061571 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="probe" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061583 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" containerName="barbican-api" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.061603 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" containerName="cinder-scheduler" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.062824 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.068808 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.088371 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.265623 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.265671 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6pbc\" (UniqueName: \"kubernetes.io/projected/9065cfba-e560-471d-bb64-e20502e5b5d6-kube-api-access-z6pbc\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.265700 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9065cfba-e560-471d-bb64-e20502e5b5d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.265749 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.265824 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.265847 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.368587 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.369162 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.369230 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.369409 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.369449 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6pbc\" (UniqueName: \"kubernetes.io/projected/9065cfba-e560-471d-bb64-e20502e5b5d6-kube-api-access-z6pbc\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.369481 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9065cfba-e560-471d-bb64-e20502e5b5d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.369604 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9065cfba-e560-471d-bb64-e20502e5b5d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.382054 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.388001 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 
13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.392224 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.392865 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6pbc\" (UniqueName: \"kubernetes.io/projected/9065cfba-e560-471d-bb64-e20502e5b5d6-kube-api-access-z6pbc\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.410588 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9065cfba-e560-471d-bb64-e20502e5b5d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"9065cfba-e560-471d-bb64-e20502e5b5d6\") " pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.707396 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.959175 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3687417c-3bcd-48ea-b328-440ff4005a02" path="/var/lib/kubelet/pods/3687417c-3bcd-48ea-b328-440ff4005a02/volumes" Dec 16 13:08:42 crc kubenswrapper[4757]: I1216 13:08:42.959745 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aae0516-59bd-448d-80be-c0df3dc002b9" path="/var/lib/kubelet/pods/4aae0516-59bd-448d-80be-c0df3dc002b9/volumes" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.267551 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.613108 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-544dfc5bc-8q666"] Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.614888 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.621237 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.621689 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.621861 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.645688 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-544dfc5bc-8q666"] Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.733956 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpj4f\" (UniqueName: \"kubernetes.io/projected/38a8b3dc-7995-4851-96db-0fb6749669b9-kube-api-access-xpj4f\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734260 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-config-data\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734290 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-public-tls-certs\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734342 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-internal-tls-certs\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734394 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a8b3dc-7995-4851-96db-0fb6749669b9-run-httpd\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734424 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38a8b3dc-7995-4851-96db-0fb6749669b9-etc-swift\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734471 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a8b3dc-7995-4851-96db-0fb6749669b9-log-httpd\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " 
pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.734506 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-combined-ca-bundle\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836318 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-config-data\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836698 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-public-tls-certs\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836763 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-internal-tls-certs\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836808 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a8b3dc-7995-4851-96db-0fb6749669b9-run-httpd\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836833 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38a8b3dc-7995-4851-96db-0fb6749669b9-etc-swift\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836882 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a8b3dc-7995-4851-96db-0fb6749669b9-log-httpd\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836925 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-combined-ca-bundle\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.836975 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpj4f\" (UniqueName: \"kubernetes.io/projected/38a8b3dc-7995-4851-96db-0fb6749669b9-kube-api-access-xpj4f\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 
crc kubenswrapper[4757]: I1216 13:08:45.838920 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a8b3dc-7995-4851-96db-0fb6749669b9-run-httpd\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.840809 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a8b3dc-7995-4851-96db-0fb6749669b9-log-httpd\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.848550 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-public-tls-certs\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.848720 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-config-data\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.860580 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-internal-tls-certs\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.869853 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a8b3dc-7995-4851-96db-0fb6749669b9-combined-ca-bundle\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.874519 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38a8b3dc-7995-4851-96db-0fb6749669b9-etc-swift\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.877945 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpj4f\" (UniqueName: \"kubernetes.io/projected/38a8b3dc-7995-4851-96db-0fb6749669b9-kube-api-access-xpj4f\") pod \"swift-proxy-544dfc5bc-8q666\" (UID: \"38a8b3dc-7995-4851-96db-0fb6749669b9\") " pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:45 crc kubenswrapper[4757]: I1216 13:08:45.947141 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.168911 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lhdf9"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.170425 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.192831 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lhdf9"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.259828 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-q6drx"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.263071 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.272347 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-q6drx"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.336051 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4656-account-create-update-jztlz"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.337095 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.340650 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.365162 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqfq\" (UniqueName: \"kubernetes.io/projected/efe43680-5772-43bd-9ca1-4cb0245cae49-kube-api-access-nnqfq\") pod \"nova-api-db-create-lhdf9\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.369439 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efe43680-5772-43bd-9ca1-4cb0245cae49-operator-scripts\") pod \"nova-api-db-create-lhdf9\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.401808 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4656-account-create-update-jztlz"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.439150 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4plp4"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.441420 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.459575 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4plp4"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.471431 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427479ca-18b6-4580-a4ef-f85bb4071c88-operator-scripts\") pod \"nova-api-4656-account-create-update-jztlz\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.471509 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnq5\" (UniqueName: \"kubernetes.io/projected/1488e9fc-9b52-4515-8fb3-980469c83ae8-kube-api-access-wtnq5\") pod \"nova-cell0-db-create-q6drx\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.471540 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnqfq\" (UniqueName: \"kubernetes.io/projected/efe43680-5772-43bd-9ca1-4cb0245cae49-kube-api-access-nnqfq\") pod \"nova-api-db-create-lhdf9\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.471594 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efe43680-5772-43bd-9ca1-4cb0245cae49-operator-scripts\") pod \"nova-api-db-create-lhdf9\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.471622 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfr4m\" (UniqueName: \"kubernetes.io/projected/427479ca-18b6-4580-a4ef-f85bb4071c88-kube-api-access-hfr4m\") pod \"nova-api-4656-account-create-update-jztlz\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.471674 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1488e9fc-9b52-4515-8fb3-980469c83ae8-operator-scripts\") pod \"nova-cell0-db-create-q6drx\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.473117 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efe43680-5772-43bd-9ca1-4cb0245cae49-operator-scripts\") pod \"nova-api-db-create-lhdf9\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.495657 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqfq\" (UniqueName: \"kubernetes.io/projected/efe43680-5772-43bd-9ca1-4cb0245cae49-kube-api-access-nnqfq\") pod \"nova-api-db-create-lhdf9\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.574228 4757 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956cce89-a20b-448a-9cf2-16b3ddcafe10-operator-scripts\") pod \"nova-cell1-db-create-4plp4\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.575029 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427479ca-18b6-4580-a4ef-f85bb4071c88-operator-scripts\") pod \"nova-api-4656-account-create-update-jztlz\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.575228 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnq5\" (UniqueName: \"kubernetes.io/projected/1488e9fc-9b52-4515-8fb3-980469c83ae8-kube-api-access-wtnq5\") pod \"nova-cell0-db-create-q6drx\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.575387 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qfd\" (UniqueName: \"kubernetes.io/projected/956cce89-a20b-448a-9cf2-16b3ddcafe10-kube-api-access-p2qfd\") pod \"nova-cell1-db-create-4plp4\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.575544 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfr4m\" (UniqueName: \"kubernetes.io/projected/427479ca-18b6-4580-a4ef-f85bb4071c88-kube-api-access-hfr4m\") pod \"nova-api-4656-account-create-update-jztlz\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.575680 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1488e9fc-9b52-4515-8fb3-980469c83ae8-operator-scripts\") pod \"nova-cell0-db-create-q6drx\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.576579 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1488e9fc-9b52-4515-8fb3-980469c83ae8-operator-scripts\") pod \"nova-cell0-db-create-q6drx\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.577555 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-44ce-account-create-update-54r5x"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.579949 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427479ca-18b6-4580-a4ef-f85bb4071c88-operator-scripts\") pod \"nova-api-4656-account-create-update-jztlz\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.581790 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.584234 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.602843 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-44ce-account-create-update-54r5x"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.612821 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnq5\" (UniqueName: \"kubernetes.io/projected/1488e9fc-9b52-4515-8fb3-980469c83ae8-kube-api-access-wtnq5\") pod \"nova-cell0-db-create-q6drx\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.621505 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfr4m\" (UniqueName: \"kubernetes.io/projected/427479ca-18b6-4580-a4ef-f85bb4071c88-kube-api-access-hfr4m\") pod \"nova-api-4656-account-create-update-jztlz\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.663723 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.678300 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bcd7-account-create-update-59qnr"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.681194 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.683139 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qfd\" (UniqueName: \"kubernetes.io/projected/956cce89-a20b-448a-9cf2-16b3ddcafe10-kube-api-access-p2qfd\") pod \"nova-cell1-db-create-4plp4\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.683411 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956cce89-a20b-448a-9cf2-16b3ddcafe10-operator-scripts\") pod \"nova-cell1-db-create-4plp4\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.684423 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.687230 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956cce89-a20b-448a-9cf2-16b3ddcafe10-operator-scripts\") pod \"nova-cell1-db-create-4plp4\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.706609 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bcd7-account-create-update-59qnr"] Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.715472 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qfd\" (UniqueName: 
\"kubernetes.io/projected/956cce89-a20b-448a-9cf2-16b3ddcafe10-kube-api-access-p2qfd\") pod \"nova-cell1-db-create-4plp4\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.779353 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.785699 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5hj\" (UniqueName: \"kubernetes.io/projected/87dfa09a-c8bd-4861-b45f-7574d8295fa1-kube-api-access-6p5hj\") pod \"nova-cell0-44ce-account-create-update-54r5x\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.785796 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da576af-ae63-447b-acaf-a6a7bfe96ddb-operator-scripts\") pod \"nova-cell1-bcd7-account-create-update-59qnr\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.785917 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4rk\" (UniqueName: \"kubernetes.io/projected/9da576af-ae63-447b-acaf-a6a7bfe96ddb-kube-api-access-pv4rk\") pod \"nova-cell1-bcd7-account-create-update-59qnr\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.786045 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87dfa09a-c8bd-4861-b45f-7574d8295fa1-operator-scripts\") pod \"nova-cell0-44ce-account-create-update-54r5x\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.792796 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.887770 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4rk\" (UniqueName: \"kubernetes.io/projected/9da576af-ae63-447b-acaf-a6a7bfe96ddb-kube-api-access-pv4rk\") pod \"nova-cell1-bcd7-account-create-update-59qnr\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.887827 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87dfa09a-c8bd-4861-b45f-7574d8295fa1-operator-scripts\") pod \"nova-cell0-44ce-account-create-update-54r5x\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.887902 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5hj\" (UniqueName: \"kubernetes.io/projected/87dfa09a-c8bd-4861-b45f-7574d8295fa1-kube-api-access-6p5hj\") pod \"nova-cell0-44ce-account-create-update-54r5x\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.887948 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da576af-ae63-447b-acaf-a6a7bfe96ddb-operator-scripts\") pod \"nova-cell1-bcd7-account-create-update-59qnr\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.888693 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da576af-ae63-447b-acaf-a6a7bfe96ddb-operator-scripts\") pod \"nova-cell1-bcd7-account-create-update-59qnr\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.889897 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87dfa09a-c8bd-4861-b45f-7574d8295fa1-operator-scripts\") pod \"nova-cell0-44ce-account-create-update-54r5x\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.890554 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.916851 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5hj\" (UniqueName: \"kubernetes.io/projected/87dfa09a-c8bd-4861-b45f-7574d8295fa1-kube-api-access-6p5hj\") pod \"nova-cell0-44ce-account-create-update-54r5x\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:46 crc kubenswrapper[4757]: I1216 13:08:46.916991 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4rk\" (UniqueName: \"kubernetes.io/projected/9da576af-ae63-447b-acaf-a6a7bfe96ddb-kube-api-access-pv4rk\") pod \"nova-cell1-bcd7-account-create-update-59qnr\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.048047 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.059790 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.726493 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.726858 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="proxy-httpd" containerID="cri-o://12223a0b7ba10d366b1b7eed777f9063629bbd4d6fc3eb3f90643d1c745b2103" gracePeriod=30 Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.727090 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="sg-core" containerID="cri-o://d728a1c8748643f0317fc1a8b2031e352883effaf8b7604f5eda0cc1aa8a8ed4" gracePeriod=30 Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.727164 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-notification-agent" containerID="cri-o://babaa5b68cae83e0260090338c0609eea17694a25b5ffba7ea5f1fc381da24dd" gracePeriod=30 Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.729764 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-central-agent" containerID="cri-o://af86fd7066c4336b80c6968c3fc8b3201c18d818a5c052797bf3272f6b25c492" gracePeriod=30 Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.996607 4757 generic.go:334] "Generic (PLEG): container finished" podID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerID="d728a1c8748643f0317fc1a8b2031e352883effaf8b7604f5eda0cc1aa8a8ed4" exitCode=2 Dec 16 13:08:47 crc kubenswrapper[4757]: I1216 13:08:47.996717 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerDied","Data":"d728a1c8748643f0317fc1a8b2031e352883effaf8b7604f5eda0cc1aa8a8ed4"} Dec 16 13:08:48 crc kubenswrapper[4757]: I1216 13:08:48.324983 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-7f64c6bbf7-pnthz" Dec 16 13:08:48 crc kubenswrapper[4757]: I1216 13:08:48.432385 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9d8647f88-tzcvz"] Dec 16 13:08:48 crc kubenswrapper[4757]: I1216 13:08:48.432788 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9d8647f88-tzcvz" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-api" containerID="cri-o://a25c39c18ad1ad4f6143348071a05dddb88a90f7dd3e1e455dd0fa66b19a0b46" gracePeriod=30 Dec 16 13:08:48 crc kubenswrapper[4757]: I1216 13:08:48.436818 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9d8647f88-tzcvz" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-httpd" containerID="cri-o://af24427995b417ea48bb519238512682f9cdda556b4a976136f4fb16060e66ba" gracePeriod=30 Dec 16 13:08:49 crc kubenswrapper[4757]: I1216 13:08:49.009219 4757 generic.go:334] "Generic (PLEG): container finished" podID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerID="12223a0b7ba10d366b1b7eed777f9063629bbd4d6fc3eb3f90643d1c745b2103" exitCode=0 Dec 16 13:08:49 crc kubenswrapper[4757]: I1216 13:08:49.009275 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerDied","Data":"12223a0b7ba10d366b1b7eed777f9063629bbd4d6fc3eb3f90643d1c745b2103"} Dec 16 13:08:50 crc kubenswrapper[4757]: I1216 13:08:50.021720 4757 generic.go:334] "Generic (PLEG): container finished" podID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerID="af24427995b417ea48bb519238512682f9cdda556b4a976136f4fb16060e66ba" exitCode=0 Dec 16 13:08:50 crc kubenswrapper[4757]: I1216 13:08:50.021775 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d8647f88-tzcvz" event={"ID":"7702799e-4dc6-4a4d-b479-59cae8163e3c","Type":"ContainerDied","Data":"af24427995b417ea48bb519238512682f9cdda556b4a976136f4fb16060e66ba"} Dec 16 13:08:50 crc kubenswrapper[4757]: I1216 13:08:50.025518 4757 generic.go:334] "Generic (PLEG): container finished" podID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerID="babaa5b68cae83e0260090338c0609eea17694a25b5ffba7ea5f1fc381da24dd" exitCode=0 Dec 16 13:08:50 crc kubenswrapper[4757]: I1216 13:08:50.025540 4757 generic.go:334] "Generic (PLEG): container finished" podID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerID="af86fd7066c4336b80c6968c3fc8b3201c18d818a5c052797bf3272f6b25c492" exitCode=0 Dec 16 13:08:50 crc kubenswrapper[4757]: I1216 13:08:50.025557 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerDied","Data":"babaa5b68cae83e0260090338c0609eea17694a25b5ffba7ea5f1fc381da24dd"} Dec 16 13:08:50 crc kubenswrapper[4757]: I1216 13:08:50.025576 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerDied","Data":"af86fd7066c4336b80c6968c3fc8b3201c18d818a5c052797bf3272f6b25c492"} Dec 16 13:08:51 crc kubenswrapper[4757]: E1216 13:08:51.618741 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 16 13:08:51 crc kubenswrapper[4757]: E1216 13:08:51.619723 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n645h5h4h55ch5c8h66h669h665h576h598h68h54h687h664h558h56fhdchc9h687h557h65fh5cch64fh564h5f8h646h88h97hbfh546h577h75q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s65jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(891886a7-6bbd-48b7-8460-a1467bae862a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 13:08:51 crc kubenswrapper[4757]: E1216 13:08:51.621240 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="891886a7-6bbd-48b7-8460-a1467bae862a" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.078939 4757 generic.go:334] "Generic (PLEG): container finished" podID="399f2693-64b1-4958-ad75-49c45b448ed5" containerID="25ab2be0f535088a98cd5974a570660c2b1ab7f032761874bdf1659a40210f03" exitCode=137 Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.080870 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerDied","Data":"25ab2be0f535088a98cd5974a570660c2b1ab7f032761874bdf1659a40210f03"} Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.099071 4757 generic.go:334] "Generic (PLEG): container finished" podID="65337bd1-c674-4817-91c2-ad150639205c" 
containerID="357eee533356136a47d50ddfe4b10cb06996ef66793b34da2123f1bd22018055" exitCode=137 Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.099226 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerDied","Data":"357eee533356136a47d50ddfe4b10cb06996ef66793b34da2123f1bd22018055"} Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.117898 4757 generic.go:334] "Generic (PLEG): container finished" podID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerID="a25c39c18ad1ad4f6143348071a05dddb88a90f7dd3e1e455dd0fa66b19a0b46" exitCode=0 Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.118182 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d8647f88-tzcvz" event={"ID":"7702799e-4dc6-4a4d-b479-59cae8163e3c","Type":"ContainerDied","Data":"a25c39c18ad1ad4f6143348071a05dddb88a90f7dd3e1e455dd0fa66b19a0b46"} Dec 16 13:08:52 crc kubenswrapper[4757]: E1216 13:08:52.144917 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="891886a7-6bbd-48b7-8460-a1467bae862a" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.151872 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.307686 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4plp4"] Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.312939 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28lmd\" (UniqueName: \"kubernetes.io/projected/d90963c3-d526-4ee8-a945-9c8cb8868a9a-kube-api-access-28lmd\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313021 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-run-httpd\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313100 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-combined-ca-bundle\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313187 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-scripts\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313240 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-log-httpd\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313285 4757 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-sg-core-conf-yaml\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313411 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-config-data\") pod \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\" (UID: \"d90963c3-d526-4ee8-a945-9c8cb8868a9a\") " Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.313586 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.314091 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.319130 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.325382 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90963c3-d526-4ee8-a945-9c8cb8868a9a-kube-api-access-28lmd" (OuterVolumeSpecName: "kube-api-access-28lmd") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "kube-api-access-28lmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.325859 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-scripts" (OuterVolumeSpecName: "scripts") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.416757 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d90963c3-d526-4ee8-a945-9c8cb8868a9a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.416793 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28lmd\" (UniqueName: \"kubernetes.io/projected/d90963c3-d526-4ee8-a945-9c8cb8868a9a-kube-api-access-28lmd\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.416810 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.585183 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.612486 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 13:08:52 crc kubenswrapper[4757]: W1216 13:08:52.615632 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9065cfba_e560_471d_bb64_e20502e5b5d6.slice/crio-78dd5122062ac8111cbeff8dfd2c9622316f66ab8112872ed41c3ab90e769b40 WatchSource:0}: Error finding container 78dd5122062ac8111cbeff8dfd2c9622316f66ab8112872ed41c3ab90e769b40: Status 404 returned error can't find the container with id 78dd5122062ac8111cbeff8dfd2c9622316f66ab8112872ed41c3ab90e769b40 Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.621380 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.713305 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.729254 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.753736 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-config-data" (OuterVolumeSpecName: "config-data") pod "d90963c3-d526-4ee8-a945-9c8cb8868a9a" (UID: "d90963c3-d526-4ee8-a945-9c8cb8868a9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:52 crc kubenswrapper[4757]: W1216 13:08:52.793254 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod427479ca_18b6_4580_a4ef_f85bb4071c88.slice/crio-b19b7c17e357df5d8479236965f548ce737ee98b8de683627c5f45f1e70db8a1 WatchSource:0}: Error finding container b19b7c17e357df5d8479236965f548ce737ee98b8de683627c5f45f1e70db8a1: Status 404 returned error can't find the container with id b19b7c17e357df5d8479236965f548ce737ee98b8de683627c5f45f1e70db8a1 Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.826510 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4656-account-create-update-jztlz"] Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.834260 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90963c3-d526-4ee8-a945-9c8cb8868a9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:52 crc kubenswrapper[4757]: I1216 13:08:52.949442 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.049556 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-httpd-config\") pod \"7702799e-4dc6-4a4d-b479-59cae8163e3c\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.049607 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msp9c\" (UniqueName: \"kubernetes.io/projected/7702799e-4dc6-4a4d-b479-59cae8163e3c-kube-api-access-msp9c\") pod \"7702799e-4dc6-4a4d-b479-59cae8163e3c\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.049757 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-ovndb-tls-certs\") pod \"7702799e-4dc6-4a4d-b479-59cae8163e3c\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.049880 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-combined-ca-bundle\") pod \"7702799e-4dc6-4a4d-b479-59cae8163e3c\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.049913 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-config\") pod \"7702799e-4dc6-4a4d-b479-59cae8163e3c\" (UID: \"7702799e-4dc6-4a4d-b479-59cae8163e3c\") " Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.062527 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7702799e-4dc6-4a4d-b479-59cae8163e3c" (UID: "7702799e-4dc6-4a4d-b479-59cae8163e3c"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.065700 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7702799e-4dc6-4a4d-b479-59cae8163e3c-kube-api-access-msp9c" (OuterVolumeSpecName: "kube-api-access-msp9c") pod "7702799e-4dc6-4a4d-b479-59cae8163e3c" (UID: "7702799e-4dc6-4a4d-b479-59cae8163e3c"). InnerVolumeSpecName "kube-api-access-msp9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:53 crc kubenswrapper[4757]: W1216 13:08:53.079355 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38a8b3dc_7995_4851_96db_0fb6749669b9.slice/crio-9a8bc7726f99f3889d7b064a06c5a1b7b4c65bae8305b23dbb45c4040c5c7371 WatchSource:0}: Error finding container 9a8bc7726f99f3889d7b064a06c5a1b7b4c65bae8305b23dbb45c4040c5c7371: Status 404 returned error can't find the container with id 9a8bc7726f99f3889d7b064a06c5a1b7b4c65bae8305b23dbb45c4040c5c7371 Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.083599 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-544dfc5bc-8q666"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.094091 4757 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd708ede9-7a1e-4baa-9c7a-21bb0007c6ee"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd708ede9-7a1e-4baa-9c7a-21bb0007c6ee] : Timed out while waiting for systemd to remove kubepods-besteffort-podd708ede9_7a1e_4baa_9c7a_21bb0007c6ee.slice" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.161672 4757 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.161717 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msp9c\" (UniqueName: \"kubernetes.io/projected/7702799e-4dc6-4a4d-b479-59cae8163e3c-kube-api-access-msp9c\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.187816 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4plp4" event={"ID":"956cce89-a20b-448a-9cf2-16b3ddcafe10","Type":"ContainerStarted","Data":"8f051f56e4baa5e1cfb82dbd4d7f6c70a9a250f0460f2b330b0f30c6ca589ac3"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.187868 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4plp4" event={"ID":"956cce89-a20b-448a-9cf2-16b3ddcafe10","Type":"ContainerStarted","Data":"587af3208ad9fe0fee91a896065e54bc4d12aeea25f9f494be48b7ec38fba56d"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.199950 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d8647f88-tzcvz" event={"ID":"7702799e-4dc6-4a4d-b479-59cae8163e3c","Type":"ContainerDied","Data":"7f6fa46392ae864a1ac70b717574ffeb1219b90ef81f02999a71e33c58e04e1e"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.200036 4757 scope.go:117] "RemoveContainer" containerID="af24427995b417ea48bb519238512682f9cdda556b4a976136f4fb16060e66ba" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.200204 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9d8647f88-tzcvz" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.220936 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4656-account-create-update-jztlz" event={"ID":"427479ca-18b6-4580-a4ef-f85bb4071c88","Type":"ContainerStarted","Data":"b19b7c17e357df5d8479236965f548ce737ee98b8de683627c5f45f1e70db8a1"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.233780 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d90963c3-d526-4ee8-a945-9c8cb8868a9a","Type":"ContainerDied","Data":"c81d1e9b1e848b506ee4ee5a6675a1ccc27f74d8cf2dae592baa1d780ee7820b"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.233875 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.239789 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-4plp4" podStartSLOduration=7.23976991 podStartE2EDuration="7.23976991s" podCreationTimestamp="2025-12-16 13:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:53.217075692 +0000 UTC m=+1318.644819498" watchObservedRunningTime="2025-12-16 13:08:53.23976991 +0000 UTC m=+1318.667513706" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.267535 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerStarted","Data":"bc30dc5c1ec1bde48a9125e46de96e8a257135a5fb2342aa9590cfa8754f8773"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.306966 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerStarted","Data":"8f67125ddbcaa77814e69db0c0e1c32f2f3b513d08902bb37e5163bedc8c2aef"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.310167 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-config" (OuterVolumeSpecName: "config") pod "7702799e-4dc6-4a4d-b479-59cae8163e3c" (UID: "7702799e-4dc6-4a4d-b479-59cae8163e3c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.352685 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-544dfc5bc-8q666" event={"ID":"38a8b3dc-7995-4851-96db-0fb6749669b9","Type":"ContainerStarted","Data":"9a8bc7726f99f3889d7b064a06c5a1b7b4c65bae8305b23dbb45c4040c5c7371"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.357619 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lhdf9"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.371217 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9065cfba-e560-471d-bb64-e20502e5b5d6","Type":"ContainerStarted","Data":"78dd5122062ac8111cbeff8dfd2c9622316f66ab8112872ed41c3ab90e769b40"} Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.373137 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.378382 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.409227 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7702799e-4dc6-4a4d-b479-59cae8163e3c" (UID: "7702799e-4dc6-4a4d-b479-59cae8163e3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.409330 4757 scope.go:117] "RemoveContainer" containerID="a25c39c18ad1ad4f6143348071a05dddb88a90f7dd3e1e455dd0fa66b19a0b46" Dec 16 13:08:53 crc kubenswrapper[4757]: W1216 13:08:53.412159 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87dfa09a_c8bd_4861_b45f_7574d8295fa1.slice/crio-e6556068d8ab303180972cc5e82ba55075094a67f2d6bd1b77d336ae9a173d01 WatchSource:0}: Error finding container e6556068d8ab303180972cc5e82ba55075094a67f2d6bd1b77d336ae9a173d01: Status 404 returned error can't find the container with id e6556068d8ab303180972cc5e82ba55075094a67f2d6bd1b77d336ae9a173d01 Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.432978 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.461352 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bcd7-account-create-update-59qnr"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.470969 4757 scope.go:117] "RemoveContainer" containerID="12223a0b7ba10d366b1b7eed777f9063629bbd4d6fc3eb3f90643d1c745b2103" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.477132 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.481887 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-q6drx"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.501130 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-44ce-account-create-update-54r5x"] Dec 16 13:08:53 crc 
kubenswrapper[4757]: I1216 13:08:53.510130 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7702799e-4dc6-4a4d-b479-59cae8163e3c" (UID: "7702799e-4dc6-4a4d-b479-59cae8163e3c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528226 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:53 crc kubenswrapper[4757]: E1216 13:08:53.528799 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="sg-core" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528810 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="sg-core" Dec 16 13:08:53 crc kubenswrapper[4757]: E1216 13:08:53.528821 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-api" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528827 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-api" Dec 16 13:08:53 crc kubenswrapper[4757]: E1216 13:08:53.528841 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-notification-agent" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528847 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-notification-agent" Dec 16 13:08:53 crc kubenswrapper[4757]: E1216 13:08:53.528869 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-httpd" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528875 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-httpd" Dec 16 13:08:53 crc kubenswrapper[4757]: E1216 13:08:53.528888 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-central-agent" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528894 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-central-agent" Dec 16 13:08:53 crc kubenswrapper[4757]: E1216 13:08:53.528915 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="proxy-httpd" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.528920 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="proxy-httpd" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.529097 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-httpd" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.529112 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" containerName="neutron-api" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.529121 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="proxy-httpd" 
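The RemoveStaleState records in this stretch show the kubelet's CPU and memory managers purging per-container pinning state for the just-deleted ceilometer-0 and neutron-9d8647f88-tzcvz pods before the replacement ceilometer-0 is admitted. The exit codes reported by the earlier PLEG "container finished" events follow the usual Unix convention: 0 is a clean shutdown, small non-zero values are application errors (sg-core's exitCode=2), and 137 (= 128 + 9) means the container was SIGKILLed, typically because it outlived its termination grace period (the horizon containers above). A minimal Python sketch for pulling these events out of a journal dump like this one; the regex and field names are assumptions inferred only from the line format visible here, not an official parser:

```python
import re
import signal

# Matches the kubelet PLEG records seen above, e.g.:
#   generic.go:334] "Generic (PLEG): container finished"
#       podID="d90963c3-..." containerID="d728a1c8..." exitCode=2
PLEG_FINISHED = re.compile(
    r'"Generic \(PLEG\): container finished".*?'
    r'podID="(?P<pod>[0-9a-f-]+)".*?'
    r'containerID="(?P<cid>[0-9a-f]+)".*?'
    r'exitCode=(?P<code>-?\d+)'
)

def describe_exit(code: int) -> str:
    """Decode an exit code the way one reads it in these logs:
    values above 128 conventionally mean 'killed by signal code-128'."""
    if code > 128:
        try:
            name = signal.Signals(code - 128).name
        except ValueError:
            name = f"signal {code - 128}"
        return f"killed by {name}"   # 137 -> SIGKILL (grace period expired)
    return "clean exit" if code == 0 else f"error exit ({code})"

def finished_containers(log_text: str):
    """Yield (podID, short containerID, verdict) for each PLEG event."""
    for m in PLEG_FINISHED.finditer(log_text):
        yield m["pod"], m["cid"][:12], describe_exit(int(m["code"]))
```

Run over this window it would report, for example, the horizon containers as "killed by SIGKILL" and ceilometer's proxy-httpd as a clean exit.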
Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.529130 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="sg-core" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.529141 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-central-agent" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.529151 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" containerName="ceilometer-notification-agent" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.531619 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.533043 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.536470 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.536811 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.585784 4757 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7702799e-4dc6-4a4d-b479-59cae8163e3c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.689862 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-scripts\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.690244 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-run-httpd\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.690314 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.690374 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.690500 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-config-data\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.690568 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zb96l\" (UniqueName: \"kubernetes.io/projected/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-kube-api-access-zb96l\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.690794 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-log-httpd\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.793854 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-run-httpd\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.793919 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.793949 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.794000 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-config-data\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.794088 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb96l\" (UniqueName: \"kubernetes.io/projected/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-kube-api-access-zb96l\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.794118 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-log-httpd\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.794152 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-scripts\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.795745 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-run-httpd\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.796142 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-log-httpd\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.799294 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.803569 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.804600 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-scripts\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.813659 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-config-data\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.820405 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb96l\" (UniqueName: \"kubernetes.io/projected/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-kube-api-access-zb96l\") pod \"ceilometer-0\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " pod="openstack/ceilometer-0" Dec 16 13:08:53 crc kubenswrapper[4757]: I1216 13:08:53.925087 4757 scope.go:117] "RemoveContainer" containerID="d728a1c8748643f0317fc1a8b2031e352883effaf8b7604f5eda0cc1aa8a8ed4" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.025697 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.036382 4757 scope.go:117] "RemoveContainer" containerID="babaa5b68cae83e0260090338c0609eea17694a25b5ffba7ea5f1fc381da24dd" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.053624 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9d8647f88-tzcvz"] Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.069984 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9d8647f88-tzcvz"] Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.150989 4757 scope.go:117] "RemoveContainer" containerID="af86fd7066c4336b80c6968c3fc8b3201c18d818a5c052797bf3272f6b25c492" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.421433 4757 generic.go:334] "Generic (PLEG): container finished" podID="427479ca-18b6-4580-a4ef-f85bb4071c88" containerID="16dd2217e5ae1a2315f7849fcd38180ee50184b219ec945589504ceef408ad24" exitCode=0 Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.421786 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4656-account-create-update-jztlz" event={"ID":"427479ca-18b6-4580-a4ef-f85bb4071c88","Type":"ContainerDied","Data":"16dd2217e5ae1a2315f7849fcd38180ee50184b219ec945589504ceef408ad24"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.423875 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-544dfc5bc-8q666" event={"ID":"38a8b3dc-7995-4851-96db-0fb6749669b9","Type":"ContainerStarted","Data":"a48c0ba53796c2317fffb104bdf475ade67491309b75698a08f355b357cd98dd"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.424862 4757 generic.go:334] "Generic (PLEG): container finished" podID="956cce89-a20b-448a-9cf2-16b3ddcafe10" containerID="8f051f56e4baa5e1cfb82dbd4d7f6c70a9a250f0460f2b330b0f30c6ca589ac3" exitCode=0 Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.424910 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4plp4" event={"ID":"956cce89-a20b-448a-9cf2-16b3ddcafe10","Type":"ContainerDied","Data":"8f051f56e4baa5e1cfb82dbd4d7f6c70a9a250f0460f2b330b0f30c6ca589ac3"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.471722 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q6drx" event={"ID":"1488e9fc-9b52-4515-8fb3-980469c83ae8","Type":"ContainerStarted","Data":"2fbb1aa8105f409a12e6a82644af4e5dd1ad00a5425a67aefcfb6debc0c8e154"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.533604 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" event={"ID":"87dfa09a-c8bd-4861-b45f-7574d8295fa1","Type":"ContainerStarted","Data":"3a9ae0ea5a868d4ac291a5f13cd7caa7114786cb1777cd6c48f7c56766b75279"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.533657 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" event={"ID":"87dfa09a-c8bd-4861-b45f-7574d8295fa1","Type":"ContainerStarted","Data":"e6556068d8ab303180972cc5e82ba55075094a67f2d6bd1b77d336ae9a173d01"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.594893 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lhdf9" event={"ID":"efe43680-5772-43bd-9ca1-4cb0245cae49","Type":"ContainerStarted","Data":"4144e02fa39fd42cb957289095adc59adb01dd6b67c1c6acb9898c1c3cd3026f"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.594957 4757 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lhdf9" event={"ID":"efe43680-5772-43bd-9ca1-4cb0245cae49","Type":"ContainerStarted","Data":"72599566a6b6ce38b82dd3fc8f160710e65ed0366de498553792a3cfd56d3741"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.646092 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" podStartSLOduration=8.646070942 podStartE2EDuration="8.646070942s" podCreationTimestamp="2025-12-16 13:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:54.595359108 +0000 UTC m=+1320.023102904" watchObservedRunningTime="2025-12-16 13:08:54.646070942 +0000 UTC m=+1320.073814738" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.650811 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9065cfba-e560-471d-bb64-e20502e5b5d6","Type":"ContainerStarted","Data":"3baea06716d90a282c18a650104a6de84e11e5421d9bb4a2ab9218b5de838845"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.665373 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" event={"ID":"9da576af-ae63-447b-acaf-a6a7bfe96ddb","Type":"ContainerStarted","Data":"1dc304aad474c8858594e0ddc43429ed0e67764892182e8ac4de358acd80179a"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.665423 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" event={"ID":"9da576af-ae63-447b-acaf-a6a7bfe96ddb","Type":"ContainerStarted","Data":"7c7bb96bcd89a611f035e2e774ec7e7bef75ca88b6519031e7d26a4cd29afcda"} Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.717910 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-lhdf9" podStartSLOduration=8.717887043 podStartE2EDuration="8.717887043s" podCreationTimestamp="2025-12-16 13:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:54.64354879 +0000 UTC m=+1320.071292586" watchObservedRunningTime="2025-12-16 13:08:54.717887043 +0000 UTC m=+1320.145630839" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.769370 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" podStartSLOduration=8.769344166 podStartE2EDuration="8.769344166s" podCreationTimestamp="2025-12-16 13:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:54.714712066 +0000 UTC m=+1320.142455862" watchObservedRunningTime="2025-12-16 13:08:54.769344166 +0000 UTC m=+1320.197087962" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.987196 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7702799e-4dc6-4a4d-b479-59cae8163e3c" path="/var/lib/kubelet/pods/7702799e-4dc6-4a4d-b479-59cae8163e3c/volumes" Dec 16 13:08:54 crc kubenswrapper[4757]: I1216 13:08:54.987924 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90963c3-d526-4ee8-a945-9c8cb8868a9a" path="/var/lib/kubelet/pods/d90963c3-d526-4ee8-a945-9c8cb8868a9a/volumes" Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.084255 4757 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ceilometer-0"] Dec 16 13:08:55 crc kubenswrapper[4757]: W1216 13:08:55.116075 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95be4d9a_c6a3_47ad_a63b_5437aff2cef1.slice/crio-5fb8ca3a88e94890d022ecf48677dc11e36545c02a3ac70bf92ecd07d5808ccb WatchSource:0}: Error finding container 5fb8ca3a88e94890d022ecf48677dc11e36545c02a3ac70bf92ecd07d5808ccb: Status 404 returned error can't find the container with id 5fb8ca3a88e94890d022ecf48677dc11e36545c02a3ac70bf92ecd07d5808ccb Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.137449 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.676046 4757 generic.go:334] "Generic (PLEG): container finished" podID="1488e9fc-9b52-4515-8fb3-980469c83ae8" containerID="ada0d0eef689644fe20216a8d2fabb8480020fef64c026e5ab8fc518b1e979dc" exitCode=0 Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.676139 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q6drx" event={"ID":"1488e9fc-9b52-4515-8fb3-980469c83ae8","Type":"ContainerDied","Data":"ada0d0eef689644fe20216a8d2fabb8480020fef64c026e5ab8fc518b1e979dc"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.677681 4757 generic.go:334] "Generic (PLEG): container finished" podID="87dfa09a-c8bd-4861-b45f-7574d8295fa1" containerID="3a9ae0ea5a868d4ac291a5f13cd7caa7114786cb1777cd6c48f7c56766b75279" exitCode=0 Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.677764 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" event={"ID":"87dfa09a-c8bd-4861-b45f-7574d8295fa1","Type":"ContainerDied","Data":"3a9ae0ea5a868d4ac291a5f13cd7caa7114786cb1777cd6c48f7c56766b75279"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.679312 4757 generic.go:334] "Generic (PLEG): container finished" podID="efe43680-5772-43bd-9ca1-4cb0245cae49" containerID="4144e02fa39fd42cb957289095adc59adb01dd6b67c1c6acb9898c1c3cd3026f" exitCode=0 Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.679451 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lhdf9" event={"ID":"efe43680-5772-43bd-9ca1-4cb0245cae49","Type":"ContainerDied","Data":"4144e02fa39fd42cb957289095adc59adb01dd6b67c1c6acb9898c1c3cd3026f"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.681400 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9065cfba-e560-471d-bb64-e20502e5b5d6","Type":"ContainerStarted","Data":"61dcc15a97097c146cce9391ff770ad65fa5636c88fffbbef265422038c5a8ab"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.683527 4757 generic.go:334] "Generic (PLEG): container finished" podID="9da576af-ae63-447b-acaf-a6a7bfe96ddb" containerID="1dc304aad474c8858594e0ddc43429ed0e67764892182e8ac4de358acd80179a" exitCode=0 Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.683588 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" event={"ID":"9da576af-ae63-447b-acaf-a6a7bfe96ddb","Type":"ContainerDied","Data":"1dc304aad474c8858594e0ddc43429ed0e67764892182e8ac4de358acd80179a"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.686085 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerStarted","Data":"5fb8ca3a88e94890d022ecf48677dc11e36545c02a3ac70bf92ecd07d5808ccb"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.688907 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-544dfc5bc-8q666" event={"ID":"38a8b3dc-7995-4851-96db-0fb6749669b9","Type":"ContainerStarted","Data":"110e0671689f52f3741ace1b97a556fba54b169cda422b01d0abe2089632a3d0"} Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.766317 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-544dfc5bc-8q666" podStartSLOduration=10.766300404999999 podStartE2EDuration="10.766300405s" podCreationTimestamp="2025-12-16 13:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:55.76563725 +0000 UTC m=+1321.193381046" watchObservedRunningTime="2025-12-16 13:08:55.766300405 +0000 UTC m=+1321.194044201" Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.822057 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=14.822028893 podStartE2EDuration="14.822028893s" podCreationTimestamp="2025-12-16 13:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:08:55.813711129 +0000 UTC m=+1321.241454925" watchObservedRunningTime="2025-12-16 13:08:55.822028893 +0000 UTC m=+1321.249772689" Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.958241 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:55 crc kubenswrapper[4757]: I1216 13:08:55.958362 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.364826 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.403833 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.468162 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427479ca-18b6-4580-a4ef-f85bb4071c88-operator-scripts\") pod \"427479ca-18b6-4580-a4ef-f85bb4071c88\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.468276 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956cce89-a20b-448a-9cf2-16b3ddcafe10-operator-scripts\") pod \"956cce89-a20b-448a-9cf2-16b3ddcafe10\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.468312 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qfd\" (UniqueName: \"kubernetes.io/projected/956cce89-a20b-448a-9cf2-16b3ddcafe10-kube-api-access-p2qfd\") pod \"956cce89-a20b-448a-9cf2-16b3ddcafe10\" (UID: \"956cce89-a20b-448a-9cf2-16b3ddcafe10\") " Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.468337 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfr4m\" (UniqueName: \"kubernetes.io/projected/427479ca-18b6-4580-a4ef-f85bb4071c88-kube-api-access-hfr4m\") pod \"427479ca-18b6-4580-a4ef-f85bb4071c88\" (UID: \"427479ca-18b6-4580-a4ef-f85bb4071c88\") " Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.469058 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427479ca-18b6-4580-a4ef-f85bb4071c88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "427479ca-18b6-4580-a4ef-f85bb4071c88" (UID: "427479ca-18b6-4580-a4ef-f85bb4071c88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.469153 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956cce89-a20b-448a-9cf2-16b3ddcafe10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "956cce89-a20b-448a-9cf2-16b3ddcafe10" (UID: "956cce89-a20b-448a-9cf2-16b3ddcafe10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.475733 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427479ca-18b6-4580-a4ef-f85bb4071c88-kube-api-access-hfr4m" (OuterVolumeSpecName: "kube-api-access-hfr4m") pod "427479ca-18b6-4580-a4ef-f85bb4071c88" (UID: "427479ca-18b6-4580-a4ef-f85bb4071c88"). InnerVolumeSpecName "kube-api-access-hfr4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.478261 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956cce89-a20b-448a-9cf2-16b3ddcafe10-kube-api-access-p2qfd" (OuterVolumeSpecName: "kube-api-access-p2qfd") pod "956cce89-a20b-448a-9cf2-16b3ddcafe10" (UID: "956cce89-a20b-448a-9cf2-16b3ddcafe10"). InnerVolumeSpecName "kube-api-access-p2qfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.577218 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956cce89-a20b-448a-9cf2-16b3ddcafe10-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.577637 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qfd\" (UniqueName: \"kubernetes.io/projected/956cce89-a20b-448a-9cf2-16b3ddcafe10-kube-api-access-p2qfd\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.577658 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfr4m\" (UniqueName: \"kubernetes.io/projected/427479ca-18b6-4580-a4ef-f85bb4071c88-kube-api-access-hfr4m\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.577671 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/427479ca-18b6-4580-a4ef-f85bb4071c88-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.700068 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4656-account-create-update-jztlz" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.700220 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4656-account-create-update-jztlz" event={"ID":"427479ca-18b6-4580-a4ef-f85bb4071c88","Type":"ContainerDied","Data":"b19b7c17e357df5d8479236965f548ce737ee98b8de683627c5f45f1e70db8a1"} Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.700370 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b19b7c17e357df5d8479236965f548ce737ee98b8de683627c5f45f1e70db8a1" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.701927 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerStarted","Data":"406c0942de168d812fc472612ca39c3837c22654f7118c2e3032d6cebeabda20"} Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.703906 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4plp4" event={"ID":"956cce89-a20b-448a-9cf2-16b3ddcafe10","Type":"ContainerDied","Data":"587af3208ad9fe0fee91a896065e54bc4d12aeea25f9f494be48b7ec38fba56d"} Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.704257 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="587af3208ad9fe0fee91a896065e54bc4d12aeea25f9f494be48b7ec38fba56d" Dec 16 13:08:56 crc kubenswrapper[4757]: I1216 13:08:56.704030 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4plp4" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.368551 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.508904 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv4rk\" (UniqueName: \"kubernetes.io/projected/9da576af-ae63-447b-acaf-a6a7bfe96ddb-kube-api-access-pv4rk\") pod \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.509067 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da576af-ae63-447b-acaf-a6a7bfe96ddb-operator-scripts\") pod \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\" (UID: \"9da576af-ae63-447b-acaf-a6a7bfe96ddb\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.510042 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da576af-ae63-447b-acaf-a6a7bfe96ddb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9da576af-ae63-447b-acaf-a6a7bfe96ddb" (UID: "9da576af-ae63-447b-acaf-a6a7bfe96ddb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.520335 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da576af-ae63-447b-acaf-a6a7bfe96ddb-kube-api-access-pv4rk" (OuterVolumeSpecName: "kube-api-access-pv4rk") pod "9da576af-ae63-447b-acaf-a6a7bfe96ddb" (UID: "9da576af-ae63-447b-acaf-a6a7bfe96ddb"). InnerVolumeSpecName "kube-api-access-pv4rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.612198 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv4rk\" (UniqueName: \"kubernetes.io/projected/9da576af-ae63-447b-acaf-a6a7bfe96ddb-kube-api-access-pv4rk\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.612594 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9da576af-ae63-447b-acaf-a6a7bfe96ddb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.639491 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.661208 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.675809 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.711486 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.714335 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5hj\" (UniqueName: \"kubernetes.io/projected/87dfa09a-c8bd-4861-b45f-7574d8295fa1-kube-api-access-6p5hj\") pod \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.714476 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87dfa09a-c8bd-4861-b45f-7574d8295fa1-operator-scripts\") pod \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\" (UID: \"87dfa09a-c8bd-4861-b45f-7574d8295fa1\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.723463 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87dfa09a-c8bd-4861-b45f-7574d8295fa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87dfa09a-c8bd-4861-b45f-7574d8295fa1" (UID: "87dfa09a-c8bd-4861-b45f-7574d8295fa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.753180 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" event={"ID":"9da576af-ae63-447b-acaf-a6a7bfe96ddb","Type":"ContainerDied","Data":"7c7bb96bcd89a611f035e2e774ec7e7bef75ca88b6519031e7d26a4cd29afcda"} Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.753221 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c7bb96bcd89a611f035e2e774ec7e7bef75ca88b6519031e7d26a4cd29afcda" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.753285 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bcd7-account-create-update-59qnr" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.767245 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerStarted","Data":"55bf993e11c1a83cf98cbfe07a183c9a98aebe3c5046084aba26acac1f518ea9"} Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.771666 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q6drx" event={"ID":"1488e9fc-9b52-4515-8fb3-980469c83ae8","Type":"ContainerDied","Data":"2fbb1aa8105f409a12e6a82644af4e5dd1ad00a5425a67aefcfb6debc0c8e154"} Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.771717 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fbb1aa8105f409a12e6a82644af4e5dd1ad00a5425a67aefcfb6debc0c8e154" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.771787 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q6drx" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.784410 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87dfa09a-c8bd-4861-b45f-7574d8295fa1-kube-api-access-6p5hj" (OuterVolumeSpecName: "kube-api-access-6p5hj") pod "87dfa09a-c8bd-4861-b45f-7574d8295fa1" (UID: "87dfa09a-c8bd-4861-b45f-7574d8295fa1"). 
InnerVolumeSpecName "kube-api-access-6p5hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.789528 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" event={"ID":"87dfa09a-c8bd-4861-b45f-7574d8295fa1","Type":"ContainerDied","Data":"e6556068d8ab303180972cc5e82ba55075094a67f2d6bd1b77d336ae9a173d01"} Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.789579 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6556068d8ab303180972cc5e82ba55075094a67f2d6bd1b77d336ae9a173d01" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.789647 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-44ce-account-create-update-54r5x" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.806785 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lhdf9" event={"ID":"efe43680-5772-43bd-9ca1-4cb0245cae49","Type":"ContainerDied","Data":"72599566a6b6ce38b82dd3fc8f160710e65ed0366de498553792a3cfd56d3741"} Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.806832 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72599566a6b6ce38b82dd3fc8f160710e65ed0366de498553792a3cfd56d3741" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.808289 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lhdf9" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.815800 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtnq5\" (UniqueName: \"kubernetes.io/projected/1488e9fc-9b52-4515-8fb3-980469c83ae8-kube-api-access-wtnq5\") pod \"1488e9fc-9b52-4515-8fb3-980469c83ae8\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.815898 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnqfq\" (UniqueName: \"kubernetes.io/projected/efe43680-5772-43bd-9ca1-4cb0245cae49-kube-api-access-nnqfq\") pod \"efe43680-5772-43bd-9ca1-4cb0245cae49\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.816063 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efe43680-5772-43bd-9ca1-4cb0245cae49-operator-scripts\") pod \"efe43680-5772-43bd-9ca1-4cb0245cae49\" (UID: \"efe43680-5772-43bd-9ca1-4cb0245cae49\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.816223 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1488e9fc-9b52-4515-8fb3-980469c83ae8-operator-scripts\") pod \"1488e9fc-9b52-4515-8fb3-980469c83ae8\" (UID: \"1488e9fc-9b52-4515-8fb3-980469c83ae8\") " Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.816764 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5hj\" (UniqueName: \"kubernetes.io/projected/87dfa09a-c8bd-4861-b45f-7574d8295fa1-kube-api-access-6p5hj\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.816785 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87dfa09a-c8bd-4861-b45f-7574d8295fa1-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.820703 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1488e9fc-9b52-4515-8fb3-980469c83ae8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1488e9fc-9b52-4515-8fb3-980469c83ae8" (UID: "1488e9fc-9b52-4515-8fb3-980469c83ae8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.820712 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe43680-5772-43bd-9ca1-4cb0245cae49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efe43680-5772-43bd-9ca1-4cb0245cae49" (UID: "efe43680-5772-43bd-9ca1-4cb0245cae49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.839230 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1488e9fc-9b52-4515-8fb3-980469c83ae8-kube-api-access-wtnq5" (OuterVolumeSpecName: "kube-api-access-wtnq5") pod "1488e9fc-9b52-4515-8fb3-980469c83ae8" (UID: "1488e9fc-9b52-4515-8fb3-980469c83ae8"). InnerVolumeSpecName "kube-api-access-wtnq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.839948 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe43680-5772-43bd-9ca1-4cb0245cae49-kube-api-access-nnqfq" (OuterVolumeSpecName: "kube-api-access-nnqfq") pod "efe43680-5772-43bd-9ca1-4cb0245cae49" (UID: "efe43680-5772-43bd-9ca1-4cb0245cae49"). InnerVolumeSpecName "kube-api-access-nnqfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.919115 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efe43680-5772-43bd-9ca1-4cb0245cae49-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.919164 4757 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1488e9fc-9b52-4515-8fb3-980469c83ae8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.919179 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtnq5\" (UniqueName: \"kubernetes.io/projected/1488e9fc-9b52-4515-8fb3-980469c83ae8-kube-api-access-wtnq5\") on node \"crc\" DevicePath \"\"" Dec 16 13:08:57 crc kubenswrapper[4757]: I1216 13:08:57.919194 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnqfq\" (UniqueName: \"kubernetes.io/projected/efe43680-5772-43bd-9ca1-4cb0245cae49-kube-api-access-nnqfq\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:01 crc kubenswrapper[4757]: I1216 13:09:01.146456 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:09:01 crc kubenswrapper[4757]: I1216 13:09:01.237642 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-544dfc5bc-8q666" Dec 16 13:09:01 crc kubenswrapper[4757]: I1216 13:09:01.468204 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:09:01 crc kubenswrapper[4757]: I1216 13:09:01.469372 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:09:01 crc kubenswrapper[4757]: I1216 13:09:01.581561 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:09:01 crc kubenswrapper[4757]: I1216 13:09:01.582469 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.432702 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"] Dec 16 13:09:02 crc kubenswrapper[4757]: E1216 13:09:02.433113 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1488e9fc-9b52-4515-8fb3-980469c83ae8" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433126 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1488e9fc-9b52-4515-8fb3-980469c83ae8" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: E1216 13:09:02.433148 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe43680-5772-43bd-9ca1-4cb0245cae49" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433157 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe43680-5772-43bd-9ca1-4cb0245cae49" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: E1216 13:09:02.433172 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956cce89-a20b-448a-9cf2-16b3ddcafe10" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433179 4757 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="956cce89-a20b-448a-9cf2-16b3ddcafe10" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: E1216 13:09:02.433199 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dfa09a-c8bd-4861-b45f-7574d8295fa1" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433205 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dfa09a-c8bd-4861-b45f-7574d8295fa1" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: E1216 13:09:02.433220 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da576af-ae63-447b-acaf-a6a7bfe96ddb" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433225 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da576af-ae63-447b-acaf-a6a7bfe96ddb" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: E1216 13:09:02.433236 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427479ca-18b6-4580-a4ef-f85bb4071c88" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433242 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="427479ca-18b6-4580-a4ef-f85bb4071c88" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433458 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="87dfa09a-c8bd-4861-b45f-7574d8295fa1" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433788 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="1488e9fc-9b52-4515-8fb3-980469c83ae8" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433806 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da576af-ae63-447b-acaf-a6a7bfe96ddb" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433817 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="956cce89-a20b-448a-9cf2-16b3ddcafe10" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433828 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe43680-5772-43bd-9ca1-4cb0245cae49" containerName="mariadb-database-create" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.433838 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="427479ca-18b6-4580-a4ef-f85bb4071c88" containerName="mariadb-account-create-update" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.434788 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.442736 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.451873 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"] Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.453500 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-55n4r" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.453755 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.505930 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-scripts\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.506102 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-config-data\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.506183 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpnq\" (UniqueName: \"kubernetes.io/projected/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-kube-api-access-zvpnq\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.506258 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.608074 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpnq\" (UniqueName: \"kubernetes.io/projected/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-kube-api-access-zvpnq\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.608594 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.608634 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-scripts\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: 
\"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.609377 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-config-data\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.615736 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.620102 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-scripts\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.624698 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-config-data\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.627692 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpnq\" (UniqueName: \"kubernetes.io/projected/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-kube-api-access-zvpnq\") pod \"nova-cell0-conductor-db-sync-bxc8x\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:02 crc kubenswrapper[4757]: I1216 13:09:02.754040 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:09:04 crc kubenswrapper[4757]: I1216 13:09:04.472565 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 13:09:04 crc kubenswrapper[4757]: W1216 13:09:04.565979 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a6fdfdb_145b_460e_b8e9_9f44e9034f40.slice/crio-8244939b2a59a8a775a360ff6be86e4523e10a373787b2a7a70671dad3a40a7b WatchSource:0}: Error finding container 8244939b2a59a8a775a360ff6be86e4523e10a373787b2a7a70671dad3a40a7b: Status 404 returned error can't find the container with id 8244939b2a59a8a775a360ff6be86e4523e10a373787b2a7a70671dad3a40a7b Dec 16 13:09:04 crc kubenswrapper[4757]: I1216 13:09:04.577238 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"] Dec 16 13:09:04 crc kubenswrapper[4757]: I1216 13:09:04.877025 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" event={"ID":"8a6fdfdb-145b-460e-b8e9-9f44e9034f40","Type":"ContainerStarted","Data":"8244939b2a59a8a775a360ff6be86e4523e10a373787b2a7a70671dad3a40a7b"} Dec 16 13:09:04 crc kubenswrapper[4757]: I1216 13:09:04.881381 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerStarted","Data":"8fd00fb08ac19145ce574e054b0b06df5331ad2215da8463b3fbb2e7f7bf4db7"} Dec 16 13:09:07 crc kubenswrapper[4757]: I1216 13:09:07.916529 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerStarted","Data":"bf85e47025c899812c13054217ee538f05a5e362513ba98777a35d7a2aad6e96"} Dec 16 13:09:07 crc kubenswrapper[4757]: I1216 13:09:07.917385 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 13:09:07 crc kubenswrapper[4757]: I1216 13:09:07.919324 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"891886a7-6bbd-48b7-8460-a1467bae862a","Type":"ContainerStarted","Data":"d37143628a6ca5354947ff795569c6f2e4bed06ed1d766dfbeb4c7b24e2dec2a"} Dec 16 13:09:07 crc kubenswrapper[4757]: I1216 13:09:07.951343 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.027829582 podStartE2EDuration="14.951320864s" podCreationTimestamp="2025-12-16 13:08:53 +0000 UTC" firstStartedPulling="2025-12-16 13:08:55.137200511 +0000 UTC m=+1320.564944307" lastFinishedPulling="2025-12-16 13:09:07.060691793 +0000 UTC m=+1332.488435589" observedRunningTime="2025-12-16 13:09:07.946904836 +0000 UTC m=+1333.374648632" watchObservedRunningTime="2025-12-16 13:09:07.951320864 +0000 UTC m=+1333.379064670" Dec 16 13:09:08 crc kubenswrapper[4757]: I1216 13:09:08.946720 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.959852337 podStartE2EDuration="36.946700355s" podCreationTimestamp="2025-12-16 13:08:32 +0000 UTC" firstStartedPulling="2025-12-16 13:08:33.570329506 +0000 UTC m=+1298.998073302" lastFinishedPulling="2025-12-16 13:09:07.557177524 +0000 UTC m=+1332.984921320" observedRunningTime="2025-12-16 13:09:08.942088201 +0000 UTC m=+1334.369831997" watchObservedRunningTime="2025-12-16 13:09:08.946700355 +0000 UTC 
m=+1334.374444151" Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.296592 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.298085 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-central-agent" containerID="cri-o://406c0942de168d812fc472612ca39c3837c22654f7118c2e3032d6cebeabda20" gracePeriod=30 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.298750 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="proxy-httpd" containerID="cri-o://bf85e47025c899812c13054217ee538f05a5e362513ba98777a35d7a2aad6e96" gracePeriod=30 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.298975 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="sg-core" containerID="cri-o://8fd00fb08ac19145ce574e054b0b06df5331ad2215da8463b3fbb2e7f7bf4db7" gracePeriod=30 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.299147 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-notification-agent" containerID="cri-o://55bf993e11c1a83cf98cbfe07a183c9a98aebe3c5046084aba26acac1f518ea9" gracePeriod=30 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.469924 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.583846 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.977624 4757 generic.go:334] "Generic (PLEG): container finished" podID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerID="bf85e47025c899812c13054217ee538f05a5e362513ba98777a35d7a2aad6e96" exitCode=0 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.978129 4757 generic.go:334] "Generic (PLEG): container finished" podID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerID="8fd00fb08ac19145ce574e054b0b06df5331ad2215da8463b3fbb2e7f7bf4db7" exitCode=2 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.978145 4757 generic.go:334] "Generic (PLEG): container finished" podID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerID="55bf993e11c1a83cf98cbfe07a183c9a98aebe3c5046084aba26acac1f518ea9" exitCode=0 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.978154 4757 generic.go:334] "Generic (PLEG): container finished" podID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerID="406c0942de168d812fc472612ca39c3837c22654f7118c2e3032d6cebeabda20" exitCode=0 Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.977698 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerDied","Data":"bf85e47025c899812c13054217ee538f05a5e362513ba98777a35d7a2aad6e96"} Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.978220 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerDied","Data":"8fd00fb08ac19145ce574e054b0b06df5331ad2215da8463b3fbb2e7f7bf4db7"} Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.978239 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerDied","Data":"55bf993e11c1a83cf98cbfe07a183c9a98aebe3c5046084aba26acac1f518ea9"} Dec 16 13:09:11 crc kubenswrapper[4757]: I1216 13:09:11.978260 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerDied","Data":"406c0942de168d812fc472612ca39c3837c22654f7118c2e3032d6cebeabda20"} Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.469317 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:09:21 crc kubenswrapper[4757]: E1216 13:09:21.509618 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Dec 16 13:09:21 crc kubenswrapper[4757]: E1216 13:09:21.510155 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c 
Dec 16 13:09:21 crc kubenswrapper[4757]: E1216 13:09:21.511366 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40"
Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.583455 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
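The canceled pull above is recorded as ErrImagePull, and the next sync (13:09:22.099201, below) reports ImagePullBackOff; retries are then spaced by an exponential back-off until the pull succeeds (here at 13:09:36.724, after which the container starts). A minimal sketch of that spacing, assuming kubelet's documented defaults of a 10s base, doubling per attempt, capped at 5 minutes — the values are illustrative, not read from this cluster's configuration:

```go
// backoff.go — sketch of the image-pull back-off schedule behind the
// ErrImagePull -> ImagePullBackOff transition in the entries above.
package main

import (
	"fmt"
	"time"
)

func main() {
	base, maxDelay := 10*time.Second, 5*time.Minute
	delay := base
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d: back off %v before next pull\n", attempt, delay)
		delay *= 2 // exponential growth: 10s, 20s, 40s, ...
		if delay > maxDelay {
			delay = maxDelay // capped at 5 minutes
		}
	}
}
```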
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.725524 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-combined-ca-bundle\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.725616 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-config-data\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.725691 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb96l\" (UniqueName: \"kubernetes.io/projected/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-kube-api-access-zb96l\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.725858 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-log-httpd\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.725919 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-run-httpd\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.726127 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-scripts\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.726212 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-sg-core-conf-yaml\") pod \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\" (UID: \"95be4d9a-c6a3-47ad-a63b-5437aff2cef1\") " Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.727032 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.727228 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.762243 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-kube-api-access-zb96l" (OuterVolumeSpecName: "kube-api-access-zb96l") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "kube-api-access-zb96l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.767221 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-scripts" (OuterVolumeSpecName: "scripts") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.795176 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.828834 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.829351 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.829462 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb96l\" (UniqueName: \"kubernetes.io/projected/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-kube-api-access-zb96l\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.829551 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.829641 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.906128 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.931271 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:21 crc kubenswrapper[4757]: I1216 13:09:21.935300 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-config-data" (OuterVolumeSpecName: "config-data") pod "95be4d9a-c6a3-47ad-a63b-5437aff2cef1" (UID: "95be4d9a-c6a3-47ad-a63b-5437aff2cef1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.034982 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95be4d9a-c6a3-47ad-a63b-5437aff2cef1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.097538 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.098195 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95be4d9a-c6a3-47ad-a63b-5437aff2cef1","Type":"ContainerDied","Data":"5fb8ca3a88e94890d022ecf48677dc11e36545c02a3ac70bf92ecd07d5808ccb"} Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.098273 4757 scope.go:117] "RemoveContainer" containerID="bf85e47025c899812c13054217ee538f05a5e362513ba98777a35d7a2aad6e96" Dec 16 13:09:22 crc kubenswrapper[4757]: E1216 13:09:22.099201 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.122344 4757 scope.go:117] "RemoveContainer" containerID="8fd00fb08ac19145ce574e054b0b06df5331ad2215da8463b3fbb2e7f7bf4db7" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.140778 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.147597 4757 scope.go:117] "RemoveContainer" containerID="55bf993e11c1a83cf98cbfe07a183c9a98aebe3c5046084aba26acac1f518ea9" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.151565 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.172091 4757 scope.go:117] "RemoveContainer" containerID="406c0942de168d812fc472612ca39c3837c22654f7118c2e3032d6cebeabda20" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.180943 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:22 crc kubenswrapper[4757]: E1216 13:09:22.181638 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-notification-agent" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.181733 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-notification-agent" Dec 16 13:09:22 crc kubenswrapper[4757]: E1216 13:09:22.181814 4757 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-central-agent" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.181888 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-central-agent" Dec 16 13:09:22 crc kubenswrapper[4757]: E1216 13:09:22.181988 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="proxy-httpd" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.182195 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="proxy-httpd" Dec 16 13:09:22 crc kubenswrapper[4757]: E1216 13:09:22.182295 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="sg-core" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.182454 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="sg-core" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.182811 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-central-agent" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.182907 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="proxy-httpd" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.183057 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="ceilometer-notification-agent" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.183149 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" containerName="sg-core" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.185604 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.197256 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.198000 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.205425 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243000 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-scripts\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243375 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-log-httpd\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243404 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tp7q\" (UniqueName: \"kubernetes.io/projected/127bb431-45b6-4112-bce0-b1a388e9e40f-kube-api-access-7tp7q\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243541 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243626 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-config-data\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243718 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.243745 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-run-httpd\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.345024 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 
13:09:22.345095 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-config-data\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.345162 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.345185 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-run-httpd\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.345268 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-scripts\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.345289 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-log-httpd\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.345311 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tp7q\" (UniqueName: \"kubernetes.io/projected/127bb431-45b6-4112-bce0-b1a388e9e40f-kube-api-access-7tp7q\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.346143 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-run-httpd\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.347760 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-log-httpd\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.349893 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.353855 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.354140 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-scripts\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.354454 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-config-data\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.364882 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tp7q\" (UniqueName: \"kubernetes.io/projected/127bb431-45b6-4112-bce0-b1a388e9e40f-kube-api-access-7tp7q\") pod \"ceilometer-0\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") " pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.531068 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:09:22 crc kubenswrapper[4757]: I1216 13:09:22.960786 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95be4d9a-c6a3-47ad-a63b-5437aff2cef1" path="/var/lib/kubelet/pods/95be4d9a-c6a3-47ad-a63b-5437aff2cef1/volumes" Dec 16 13:09:23 crc kubenswrapper[4757]: I1216 13:09:23.128559 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:23 crc kubenswrapper[4757]: W1216 13:09:23.138650 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod127bb431_45b6_4112_bce0_b1a388e9e40f.slice/crio-74e9f2bf8768fb04d34e67981b5594632d2dec760f122fb1cdc351380a4a1cc3 WatchSource:0}: Error finding container 74e9f2bf8768fb04d34e67981b5594632d2dec760f122fb1cdc351380a4a1cc3: Status 404 returned error can't find the container with id 74e9f2bf8768fb04d34e67981b5594632d2dec760f122fb1cdc351380a4a1cc3 Dec 16 13:09:23 crc kubenswrapper[4757]: I1216 13:09:23.626054 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:24 crc kubenswrapper[4757]: I1216 13:09:24.116100 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerStarted","Data":"74e9f2bf8768fb04d34e67981b5594632d2dec760f122fb1cdc351380a4a1cc3"} Dec 16 13:09:26 crc kubenswrapper[4757]: I1216 13:09:26.140405 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerStarted","Data":"7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6"} Dec 16 13:09:28 crc kubenswrapper[4757]: I1216 13:09:28.171784 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerStarted","Data":"b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1"} Dec 16 13:09:30 crc kubenswrapper[4757]: I1216 13:09:30.189719 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerStarted","Data":"015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613"} Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.468282 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" 
podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.468673 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.469547 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"bc30dc5c1ec1bde48a9125e46de96e8a257135a5fb2342aa9590cfa8754f8773"} pod="openstack/horizon-75ccc7d896-jmrk9" containerMessage="Container horizon failed startup probe, will be restarted" Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.469595 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" containerID="cri-o://bc30dc5c1ec1bde48a9125e46de96e8a257135a5fb2342aa9590cfa8754f8773" gracePeriod=30 Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.582211 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.582334 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.583077 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"8f67125ddbcaa77814e69db0c0e1c32f2f3b513d08902bb37e5163bedc8c2aef"} pod="openstack/horizon-5d66ddf65b-lmltr" containerMessage="Container horizon failed startup probe, will be restarted" Dec 16 13:09:31 crc kubenswrapper[4757]: I1216 13:09:31.583113 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" containerID="cri-o://8f67125ddbcaa77814e69db0c0e1c32f2f3b513d08902bb37e5163bedc8c2aef" gracePeriod=30 Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.233395 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerStarted","Data":"2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1"} Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.234025 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.233624 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="sg-core" containerID="cri-o://015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613" gracePeriod=30 Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.233618 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="proxy-httpd" 
containerID="cri-o://2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1" gracePeriod=30 Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.233603 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-notification-agent" containerID="cri-o://b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1" gracePeriod=30 Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.233536 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-central-agent" containerID="cri-o://7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6" gracePeriod=30 Dec 16 13:09:34 crc kubenswrapper[4757]: I1216 13:09:34.270363 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.963201491 podStartE2EDuration="12.270336066s" podCreationTimestamp="2025-12-16 13:09:22 +0000 UTC" firstStartedPulling="2025-12-16 13:09:23.141288456 +0000 UTC m=+1348.569032252" lastFinishedPulling="2025-12-16 13:09:33.448423031 +0000 UTC m=+1358.876166827" observedRunningTime="2025-12-16 13:09:34.259970162 +0000 UTC m=+1359.687713958" watchObservedRunningTime="2025-12-16 13:09:34.270336066 +0000 UTC m=+1359.698079862" Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.243974 4757 generic.go:334] "Generic (PLEG): container finished" podID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerID="2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1" exitCode=0 Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.244303 4757 generic.go:334] "Generic (PLEG): container finished" podID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerID="015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613" exitCode=2 Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.244313 4757 generic.go:334] "Generic (PLEG): container finished" podID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerID="b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1" exitCode=0 Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.244150 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerDied","Data":"2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1"} Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.244347 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerDied","Data":"015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613"} Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.244361 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerDied","Data":"b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1"} Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.597138 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.597854 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-log" 
containerID="cri-o://f8d93c404b1b656f408c4baae8b376e616731c4e5531a4ad8f1763546e23f59d" gracePeriod=30 Dec 16 13:09:35 crc kubenswrapper[4757]: I1216 13:09:35.597999 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-httpd" containerID="cri-o://49a74036aa1ba6dc9a0c5edfcbb46392c94d56a8c44ad631d4951f9cf16d3f78" gracePeriod=30 Dec 16 13:09:36 crc kubenswrapper[4757]: I1216 13:09:36.258887 4757 generic.go:334] "Generic (PLEG): container finished" podID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerID="f8d93c404b1b656f408c4baae8b376e616731c4e5531a4ad8f1763546e23f59d" exitCode=143 Dec 16 13:09:36 crc kubenswrapper[4757]: I1216 13:09:36.259055 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f22fc73-c034-4c9d-8274-215e0ef2a208","Type":"ContainerDied","Data":"f8d93c404b1b656f408c4baae8b376e616731c4e5531a4ad8f1763546e23f59d"} Dec 16 13:09:37 crc kubenswrapper[4757]: I1216 13:09:37.270898 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" event={"ID":"8a6fdfdb-145b-460e-b8e9-9f44e9034f40","Type":"ContainerStarted","Data":"4c2346231f5aa5a3f75291af9b526f1c18d54ecfa10f7c241ed4613de9179a22"} Dec 16 13:09:37 crc kubenswrapper[4757]: I1216 13:09:37.288832 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" podStartSLOduration=3.13306852 podStartE2EDuration="35.288812521s" podCreationTimestamp="2025-12-16 13:09:02 +0000 UTC" firstStartedPulling="2025-12-16 13:09:04.568547771 +0000 UTC m=+1329.996291567" lastFinishedPulling="2025-12-16 13:09:36.724291772 +0000 UTC m=+1362.152035568" observedRunningTime="2025-12-16 13:09:37.285214353 +0000 UTC m=+1362.712958159" watchObservedRunningTime="2025-12-16 13:09:37.288812521 +0000 UTC m=+1362.716556317" Dec 16 13:09:38 crc kubenswrapper[4757]: I1216 13:09:38.357296 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 13:09:38 crc kubenswrapper[4757]: I1216 13:09:38.357892 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-log" containerID="cri-o://b2a06ac9d2ae490290b06680078382d77bf03896a450c8d9219e69834fa8ffaf" gracePeriod=30 Dec 16 13:09:38 crc kubenswrapper[4757]: I1216 13:09:38.358085 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-httpd" containerID="cri-o://6090910b49339dc35f9500d38541fc4e7f62155a18bb86a432bc3ce4ec1caaa3" gracePeriod=30 Dec 16 13:09:38 crc kubenswrapper[4757]: E1216 13:09:38.498496 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda952307_39db_4816_8465_d931bd94436d.slice/crio-conmon-b2a06ac9d2ae490290b06680078382d77bf03896a450c8d9219e69834fa8ffaf.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.291092 4757 generic.go:334] "Generic (PLEG): container finished" podID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerID="49a74036aa1ba6dc9a0c5edfcbb46392c94d56a8c44ad631d4951f9cf16d3f78" exitCode=0 Dec 
16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.291344 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f22fc73-c034-4c9d-8274-215e0ef2a208","Type":"ContainerDied","Data":"49a74036aa1ba6dc9a0c5edfcbb46392c94d56a8c44ad631d4951f9cf16d3f78"} Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.292826 4757 generic.go:334] "Generic (PLEG): container finished" podID="da952307-39db-4816-8465-d931bd94436d" containerID="b2a06ac9d2ae490290b06680078382d77bf03896a450c8d9219e69834fa8ffaf" exitCode=143 Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.292854 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"da952307-39db-4816-8465-d931bd94436d","Type":"ContainerDied","Data":"b2a06ac9d2ae490290b06680078382d77bf03896a450c8d9219e69834fa8ffaf"} Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.526032 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671475 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-logs\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671609 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-combined-ca-bundle\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671648 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-httpd-run\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671681 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-config-data\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671719 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671745 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-public-tls-certs\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.671837 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flk28\" (UniqueName: \"kubernetes.io/projected/9f22fc73-c034-4c9d-8274-215e0ef2a208-kube-api-access-flk28\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 
13:09:39.671890 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-scripts\") pod \"9f22fc73-c034-4c9d-8274-215e0ef2a208\" (UID: \"9f22fc73-c034-4c9d-8274-215e0ef2a208\") " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.680963 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f22fc73-c034-4c9d-8274-215e0ef2a208-kube-api-access-flk28" (OuterVolumeSpecName: "kube-api-access-flk28") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "kube-api-access-flk28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.681622 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-scripts" (OuterVolumeSpecName: "scripts") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.681937 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.695666 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-logs" (OuterVolumeSpecName: "logs") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.705114 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.756332 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.775650 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.775690 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flk28\" (UniqueName: \"kubernetes.io/projected/9f22fc73-c034-4c9d-8274-215e0ef2a208-kube-api-access-flk28\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.775703 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.775714 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.775724 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.775734 4757 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f22fc73-c034-4c9d-8274-215e0ef2a208-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.812955 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.821761 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.826195 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-config-data" (OuterVolumeSpecName: "config-data") pod "9f22fc73-c034-4c9d-8274-215e0ef2a208" (UID: "9f22fc73-c034-4c9d-8274-215e0ef2a208"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.877341 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.877384 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:39 crc kubenswrapper[4757]: I1216 13:09:39.877396 4757 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f22fc73-c034-4c9d-8274-215e0ef2a208-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.302492 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9f22fc73-c034-4c9d-8274-215e0ef2a208","Type":"ContainerDied","Data":"2c7545ccd06e40a0e3703e23eb33f7a3c40e2215e88f6a67d1cacfcb92b07af9"} Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.302540 4757 scope.go:117] "RemoveContainer" containerID="49a74036aa1ba6dc9a0c5edfcbb46392c94d56a8c44ad631d4951f9cf16d3f78" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.302650 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.349095 4757 scope.go:117] "RemoveContainer" containerID="f8d93c404b1b656f408c4baae8b376e616731c4e5531a4ad8f1763546e23f59d" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.352086 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.375780 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.400067 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:09:40 crc kubenswrapper[4757]: E1216 13:09:40.400466 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-log" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.400483 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-log" Dec 16 13:09:40 crc kubenswrapper[4757]: E1216 13:09:40.400516 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-httpd" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.400522 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-httpd" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.400672 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-httpd" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.400705 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" containerName="glance-log" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.402035 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.425041 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.429671 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.429873 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.492655 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7zh\" (UniqueName: \"kubernetes.io/projected/40de399c-634b-4d44-a9ca-0aec62a9088b-kube-api-access-gg7zh\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.492705 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-config-data\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.492922 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40de399c-634b-4d44-a9ca-0aec62a9088b-logs\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.492968 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.493023 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-scripts\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.493108 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40de399c-634b-4d44-a9ca-0aec62a9088b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.493293 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.493365 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.594508 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40de399c-634b-4d44-a9ca-0aec62a9088b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.595660 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.595295 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40de399c-634b-4d44-a9ca-0aec62a9088b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.595698 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.595842 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.596162 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7zh\" (UniqueName: \"kubernetes.io/projected/40de399c-634b-4d44-a9ca-0aec62a9088b-kube-api-access-gg7zh\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.596250 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-config-data\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.596554 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40de399c-634b-4d44-a9ca-0aec62a9088b-logs\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.597107 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/40de399c-634b-4d44-a9ca-0aec62a9088b-logs\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.597176 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.597230 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-scripts\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.603670 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.604388 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-config-data\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.605693 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-scripts\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.606409 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de399c-634b-4d44-a9ca-0aec62a9088b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.617409 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7zh\" (UniqueName: \"kubernetes.io/projected/40de399c-634b-4d44-a9ca-0aec62a9088b-kube-api-access-gg7zh\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.636166 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"40de399c-634b-4d44-a9ca-0aec62a9088b\") " pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.746266 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 13:09:40 crc kubenswrapper[4757]: I1216 13:09:40.976743 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f22fc73-c034-4c9d-8274-215e0ef2a208" path="/var/lib/kubelet/pods/9f22fc73-c034-4c9d-8274-215e0ef2a208/volumes" Dec 16 13:09:41 crc kubenswrapper[4757]: W1216 13:09:41.482848 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40de399c_634b_4d44_a9ca_0aec62a9088b.slice/crio-53f085e2a0385d44e42d3610f593ed2e758638feec2095578ecac17708a3885a WatchSource:0}: Error finding container 53f085e2a0385d44e42d3610f593ed2e758638feec2095578ecac17708a3885a: Status 404 returned error can't find the container with id 53f085e2a0385d44e42d3610f593ed2e758638feec2095578ecac17708a3885a Dec 16 13:09:41 crc kubenswrapper[4757]: I1216 13:09:41.497421 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.353320 4757 generic.go:334] "Generic (PLEG): container finished" podID="da952307-39db-4816-8465-d931bd94436d" containerID="6090910b49339dc35f9500d38541fc4e7f62155a18bb86a432bc3ce4ec1caaa3" exitCode=0 Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.353618 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"da952307-39db-4816-8465-d931bd94436d","Type":"ContainerDied","Data":"6090910b49339dc35f9500d38541fc4e7f62155a18bb86a432bc3ce4ec1caaa3"} Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.356233 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40de399c-634b-4d44-a9ca-0aec62a9088b","Type":"ContainerStarted","Data":"53f085e2a0385d44e42d3610f593ed2e758638feec2095578ecac17708a3885a"} Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.624956 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.767882 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dkpg\" (UniqueName: \"kubernetes.io/projected/da952307-39db-4816-8465-d931bd94436d-kube-api-access-4dkpg\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.767918 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-logs\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.769130 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-logs" (OuterVolumeSpecName: "logs") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.769193 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-combined-ca-bundle\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.769791 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-scripts\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.770070 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-internal-tls-certs\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.770317 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.770673 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-httpd-run\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.770789 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-config-data\") pod \"da952307-39db-4816-8465-d931bd94436d\" (UID: \"da952307-39db-4816-8465-d931bd94436d\") " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.771385 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.771994 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.777239 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da952307-39db-4816-8465-d931bd94436d-kube-api-access-4dkpg" (OuterVolumeSpecName: "kube-api-access-4dkpg") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "kube-api-access-4dkpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.777605 4757 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da952307-39db-4816-8465-d931bd94436d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.790440 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-scripts" (OuterVolumeSpecName: "scripts") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.798335 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.849022 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.882455 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dkpg\" (UniqueName: \"kubernetes.io/projected/da952307-39db-4816-8465-d931bd94436d-kube-api-access-4dkpg\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.882499 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.882514 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.882552 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.916805 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-config-data" (OuterVolumeSpecName: "config-data") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.944900 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "da952307-39db-4816-8465-d931bd94436d" (UID: "da952307-39db-4816-8465-d931bd94436d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.961056 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.985620 4757 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.985673 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:42 crc kubenswrapper[4757]: I1216 13:09:42.985688 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da952307-39db-4816-8465-d931bd94436d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.369790 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40de399c-634b-4d44-a9ca-0aec62a9088b","Type":"ContainerStarted","Data":"2be30502a46f0b092c6c1264bb12307167f7a37e0444e0c057db77196d442893"} Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.369841 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40de399c-634b-4d44-a9ca-0aec62a9088b","Type":"ContainerStarted","Data":"325b3532342f0f99fa994536cdf8dfe6147713f79d64c401d962828caa798a42"} Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.372908 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"da952307-39db-4816-8465-d931bd94436d","Type":"ContainerDied","Data":"bce010abd62e016254d72186f5a6337abfac7af02831fadc656261a40cb29f0f"} Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.372943 4757 util.go:48] "No ready sandbox for pod can be found. 
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.372981 4757 scope.go:117] "RemoveContainer" containerID="6090910b49339dc35f9500d38541fc4e7f62155a18bb86a432bc3ce4ec1caaa3"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.398294 4757 scope.go:117] "RemoveContainer" containerID="b2a06ac9d2ae490290b06680078382d77bf03896a450c8d9219e69834fa8ffaf"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.404941 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.404916964 podStartE2EDuration="3.404916964s" podCreationTimestamp="2025-12-16 13:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:09:43.392611012 +0000 UTC m=+1368.820354828" watchObservedRunningTime="2025-12-16 13:09:43.404916964 +0000 UTC m=+1368.832660770"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.420926 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.431371 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.458550 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 13:09:43 crc kubenswrapper[4757]: E1216 13:09:43.458986 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-log"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.459023 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-log"
Dec 16 13:09:43 crc kubenswrapper[4757]: E1216 13:09:43.459050 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-httpd"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.459064 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-httpd"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.459826 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-log"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.459878 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="da952307-39db-4816-8465-d931bd94436d" containerName="glance-httpd"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.466697 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.469790 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.469951 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.492503 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.600570 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.600655 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.600741 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e5b2048-f283-4bad-a57a-ae09865c33f2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.600860 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.600918 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5b2048-f283-4bad-a57a-ae09865c33f2-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.600966 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.601237 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4sn\" (UniqueName: \"kubernetes.io/projected/4e5b2048-f283-4bad-a57a-ae09865c33f2-kube-api-access-xl4sn\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.601458 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.703454 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.703520 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e5b2048-f283-4bad-a57a-ae09865c33f2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.703575 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.703972 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e5b2048-f283-4bad-a57a-ae09865c33f2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704069 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5b2048-f283-4bad-a57a-ae09865c33f2-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704096 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704185 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4sn\" (UniqueName: \"kubernetes.io/projected/4e5b2048-f283-4bad-a57a-ae09865c33f2-kube-api-access-xl4sn\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704290 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704537 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704706 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.704601 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5b2048-f283-4bad-a57a-ae09865c33f2-logs\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.712882 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.713227 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.715302 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.723851 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl4sn\" (UniqueName: \"kubernetes.io/projected/4e5b2048-f283-4bad-a57a-ae09865c33f2-kube-api-access-xl4sn\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.724608 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5b2048-f283-4bad-a57a-ae09865c33f2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.735081 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"4e5b2048-f283-4bad-a57a-ae09865c33f2\") " pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:43 crc kubenswrapper[4757]: I1216 13:09:43.784362 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 16 13:09:44 crc kubenswrapper[4757]: W1216 13:09:44.462985 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e5b2048_f283_4bad_a57a_ae09865c33f2.slice/crio-af17f89b39effe40cc5b84130153c179d1f4e4a4ab7c3822e913370b88e385ae WatchSource:0}: Error finding container af17f89b39effe40cc5b84130153c179d1f4e4a4ab7c3822e913370b88e385ae: Status 404 returned error can't find the container with id af17f89b39effe40cc5b84130153c179d1f4e4a4ab7c3822e913370b88e385ae
Dec 16 13:09:44 crc kubenswrapper[4757]: I1216 13:09:44.464679 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.073081 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da952307-39db-4816-8465-d931bd94436d" path="/var/lib/kubelet/pods/da952307-39db-4816-8465-d931bd94436d/volumes"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.258776 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350492 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-log-httpd\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350551 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-run-httpd\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350652 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-combined-ca-bundle\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350702 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-scripts\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350774 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tp7q\" (UniqueName: \"kubernetes.io/projected/127bb431-45b6-4112-bce0-b1a388e9e40f-kube-api-access-7tp7q\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350903 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-config-data\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.350964 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-sg-core-conf-yaml\") pod \"127bb431-45b6-4112-bce0-b1a388e9e40f\" (UID: \"127bb431-45b6-4112-bce0-b1a388e9e40f\") "
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.352492 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.353829 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.382103 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127bb431-45b6-4112-bce0-b1a388e9e40f-kube-api-access-7tp7q" (OuterVolumeSpecName: "kube-api-access-7tp7q") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "kube-api-access-7tp7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.386116 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-scripts" (OuterVolumeSpecName: "scripts") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.438570 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4e5b2048-f283-4bad-a57a-ae09865c33f2","Type":"ContainerStarted","Data":"af17f89b39effe40cc5b84130153c179d1f4e4a4ab7c3822e913370b88e385ae"}
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.454082 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.459665 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127bb431-45b6-4112-bce0-b1a388e9e40f-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.459754 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-scripts\") on node \"crc\" DevicePath \"\""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.459839 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tp7q\" (UniqueName: \"kubernetes.io/projected/127bb431-45b6-4112-bce0-b1a388e9e40f-kube-api-access-7tp7q\") on node \"crc\" DevicePath \"\""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.467908 4757 generic.go:334] "Generic (PLEG): container finished" podID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerID="7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6" exitCode=0
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.469100 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerDied","Data":"7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6"}
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.470467 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127bb431-45b6-4112-bce0-b1a388e9e40f","Type":"ContainerDied","Data":"74e9f2bf8768fb04d34e67981b5594632d2dec760f122fb1cdc351380a4a1cc3"}
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.470646 4757 scope.go:117] "RemoveContainer" containerID="2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.469217 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.524513 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.555516 4757 scope.go:117] "RemoveContainer" containerID="015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.561905 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.569758 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-config-data" (OuterVolumeSpecName: "config-data") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.580232 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "127bb431-45b6-4112-bce0-b1a388e9e40f" (UID: "127bb431-45b6-4112-bce0-b1a388e9e40f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.580250 4757 scope.go:117] "RemoveContainer" containerID="b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.605990 4757 scope.go:117] "RemoveContainer" containerID="7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.626849 4757 scope.go:117] "RemoveContainer" containerID="2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1"
Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.627495 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1\": container with ID starting with 2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1 not found: ID does not exist" containerID="2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.627567 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1"} err="failed to get container status \"2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1\": rpc error: code = NotFound desc = could not find container \"2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1\": container with ID starting with 2fd5b2d7e8c06086d538412fcaf6f1cea2f5b73fa5897fbe35d93cfdb9e9a1a1 not found: ID does not exist"
Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.627624 4757 scope.go:117] "RemoveContainer" containerID="015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613"
Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.629426 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613\": container with ID starting with 015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613 not found: ID does not exist" containerID="015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613"
containerID="015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.629463 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613"} err="failed to get container status \"015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613\": rpc error: code = NotFound desc = could not find container \"015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613\": container with ID starting with 015c2d98e38d8999166f06892860acdbf44dee5db79229b689c15c374d79f613 not found: ID does not exist" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.629487 4757 scope.go:117] "RemoveContainer" containerID="b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1" Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.629742 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1\": container with ID starting with b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1 not found: ID does not exist" containerID="b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.629773 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1"} err="failed to get container status \"b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1\": rpc error: code = NotFound desc = could not find container \"b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1\": container with ID starting with b52596c6a18e376b866be9b286a70bb13c08a28da95917163b58fdcea06596f1 not found: ID does not exist" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.629791 4757 scope.go:117] "RemoveContainer" containerID="7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6" Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.630036 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6\": container with ID starting with 7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6 not found: ID does not exist" containerID="7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.630066 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6"} err="failed to get container status \"7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6\": rpc error: code = NotFound desc = could not find container \"7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6\": container with ID starting with 7e31c5f71b738ce0da3336d294014f18f1dde655ff4ab65138b621bf967825e6 not found: ID does not exist" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.664125 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.664168 4757 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/127bb431-45b6-4112-bce0-b1a388e9e40f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.835198 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.857049 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874263 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.874623 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="proxy-httpd" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874636 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="proxy-httpd" Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.874662 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-notification-agent" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874670 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-notification-agent" Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.874681 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="sg-core" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874689 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="sg-core" Dec 16 13:09:45 crc kubenswrapper[4757]: E1216 13:09:45.874705 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-central-agent" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874714 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-central-agent" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874871 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="sg-core" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874891 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-notification-agent" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874908 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="ceilometer-central-agent" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.874919 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" containerName="proxy-httpd" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.876572 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.883901 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.884606 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.891563 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.971853 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-scripts\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.971957 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.971991 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-log-httpd\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.972142 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb5g8\" (UniqueName: \"kubernetes.io/projected/ddca11db-397f-4375-965c-4ab9644469f5-kube-api-access-jb5g8\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.972183 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.972244 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-run-httpd\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:45 crc kubenswrapper[4757]: I1216 13:09:45.972268 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-config-data\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073478 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-run-httpd\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073552 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-config-data\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073638 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-scripts\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073719 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073754 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-log-httpd\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073865 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb5g8\" (UniqueName: \"kubernetes.io/projected/ddca11db-397f-4375-965c-4ab9644469f5-kube-api-access-jb5g8\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.073897 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.075748 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-log-httpd\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.076351 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-run-httpd\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.083790 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-scripts\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.087270 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.106757 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.108849 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb5g8\" (UniqueName: \"kubernetes.io/projected/ddca11db-397f-4375-965c-4ab9644469f5-kube-api-access-jb5g8\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.112616 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-config-data\") pod \"ceilometer-0\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.212292 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.491571 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4e5b2048-f283-4bad-a57a-ae09865c33f2","Type":"ContainerStarted","Data":"1619851f8f244d478f61fba343fe73eaf401f007dcf1187c4ee0250c49f9685d"} Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.813749 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:09:46 crc kubenswrapper[4757]: I1216 13:09:46.960609 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127bb431-45b6-4112-bce0-b1a388e9e40f" path="/var/lib/kubelet/pods/127bb431-45b6-4112-bce0-b1a388e9e40f/volumes" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.506396 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4e5b2048-f283-4bad-a57a-ae09865c33f2","Type":"ContainerStarted","Data":"0df1189b71d5b86e0199d042f6b5cd94bf198c94be28ab41b82f4bf1fbfd90b3"} Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.509024 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerStarted","Data":"2cf8a9289219cd20f14ee131b941a461b291f0321699554321fc6760d93af54c"} Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.541302 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.541279535 podStartE2EDuration="4.541279535s" podCreationTimestamp="2025-12-16 13:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:09:47.528832591 +0000 UTC m=+1372.956576387" watchObservedRunningTime="2025-12-16 13:09:47.541279535 +0000 UTC m=+1372.969023341" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.685062 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csx2d"] Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.687363 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.743149 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csx2d"] Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.819569 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-utilities\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.819650 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-catalog-content\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.819752 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvfv6\" (UniqueName: \"kubernetes.io/projected/9f892506-3098-432e-94f2-26b4550c3228-kube-api-access-kvfv6\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.921947 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-utilities\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.922313 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-catalog-content\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.922576 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvfv6\" (UniqueName: \"kubernetes.io/projected/9f892506-3098-432e-94f2-26b4550c3228-kube-api-access-kvfv6\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.922606 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-utilities\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.922836 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-catalog-content\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:47 crc kubenswrapper[4757]: I1216 13:09:47.939908 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kvfv6\" (UniqueName: \"kubernetes.io/projected/9f892506-3098-432e-94f2-26b4550c3228-kube-api-access-kvfv6\") pod \"redhat-operators-csx2d\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:48 crc kubenswrapper[4757]: I1216 13:09:48.075941 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:48 crc kubenswrapper[4757]: I1216 13:09:48.519024 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerStarted","Data":"f86c0908b82f8901d9650fe3f6c40f9b49801fd823ab7e7b27e7a8fb5abd3f40"} Dec 16 13:09:48 crc kubenswrapper[4757]: W1216 13:09:48.590348 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f892506_3098_432e_94f2_26b4550c3228.slice/crio-80240389ef6bb7a56020c02473c85a7f0a9bf731d280447545112ac232d6f9ed WatchSource:0}: Error finding container 80240389ef6bb7a56020c02473c85a7f0a9bf731d280447545112ac232d6f9ed: Status 404 returned error can't find the container with id 80240389ef6bb7a56020c02473c85a7f0a9bf731d280447545112ac232d6f9ed Dec 16 13:09:48 crc kubenswrapper[4757]: I1216 13:09:48.592479 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csx2d"] Dec 16 13:09:49 crc kubenswrapper[4757]: I1216 13:09:49.533188 4757 generic.go:334] "Generic (PLEG): container finished" podID="9f892506-3098-432e-94f2-26b4550c3228" containerID="5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e" exitCode=0 Dec 16 13:09:49 crc kubenswrapper[4757]: I1216 13:09:49.533276 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerDied","Data":"5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e"} Dec 16 13:09:49 crc kubenswrapper[4757]: I1216 13:09:49.533692 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerStarted","Data":"80240389ef6bb7a56020c02473c85a7f0a9bf731d280447545112ac232d6f9ed"} Dec 16 13:09:49 crc kubenswrapper[4757]: I1216 13:09:49.536955 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerStarted","Data":"2af665ab9c7b64e5c5dd789906b7912141a5b33a0282f1d558aa0c9c07a56791"} Dec 16 13:09:50 crc kubenswrapper[4757]: I1216 13:09:50.548769 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerStarted","Data":"e7b4d09a7468df03bd152559d7e6ca9742634080e882af18edd905c7d8e629b8"} Dec 16 13:09:50 crc kubenswrapper[4757]: I1216 13:09:50.746726 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 13:09:50 crc kubenswrapper[4757]: I1216 13:09:50.747353 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 13:09:50 crc kubenswrapper[4757]: I1216 13:09:50.780233 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 13:09:50 crc kubenswrapper[4757]: I1216 
13:09:50.794420 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 13:09:51 crc kubenswrapper[4757]: I1216 13:09:51.560163 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerStarted","Data":"aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4"} Dec 16 13:09:51 crc kubenswrapper[4757]: I1216 13:09:51.561982 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 13:09:51 crc kubenswrapper[4757]: I1216 13:09:51.562029 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 13:09:52 crc kubenswrapper[4757]: I1216 13:09:52.571683 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerStarted","Data":"90ff743035787a1cc9ea0276eda2248d6d9332d554b0c014842bdd074429f7d4"} Dec 16 13:09:52 crc kubenswrapper[4757]: I1216 13:09:52.603317 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.859603306 podStartE2EDuration="7.603295268s" podCreationTimestamp="2025-12-16 13:09:45 +0000 UTC" firstStartedPulling="2025-12-16 13:09:46.806879318 +0000 UTC m=+1372.234623114" lastFinishedPulling="2025-12-16 13:09:51.55057128 +0000 UTC m=+1376.978315076" observedRunningTime="2025-12-16 13:09:52.597571318 +0000 UTC m=+1378.025315114" watchObservedRunningTime="2025-12-16 13:09:52.603295268 +0000 UTC m=+1378.031039064" Dec 16 13:09:53 crc kubenswrapper[4757]: I1216 13:09:53.580276 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 13:09:53 crc kubenswrapper[4757]: I1216 13:09:53.784675 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:53 crc kubenswrapper[4757]: I1216 13:09:53.784764 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:53 crc kubenswrapper[4757]: I1216 13:09:53.841101 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:53 crc kubenswrapper[4757]: I1216 13:09:53.843764 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:54 crc kubenswrapper[4757]: I1216 13:09:54.588493 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:54 crc kubenswrapper[4757]: I1216 13:09:54.588556 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:55 crc kubenswrapper[4757]: I1216 13:09:55.598199 4757 generic.go:334] "Generic (PLEG): container finished" podID="9f892506-3098-432e-94f2-26b4550c3228" containerID="aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4" exitCode=0 Dec 16 13:09:55 crc kubenswrapper[4757]: I1216 13:09:55.598281 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerDied","Data":"aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4"} Dec 16 
13:09:56 crc kubenswrapper[4757]: I1216 13:09:56.647888 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 13:09:56 crc kubenswrapper[4757]: I1216 13:09:56.648325 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:09:56 crc kubenswrapper[4757]: I1216 13:09:56.649117 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 13:09:57 crc kubenswrapper[4757]: I1216 13:09:57.625310 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerStarted","Data":"b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4"} Dec 16 13:09:57 crc kubenswrapper[4757]: I1216 13:09:57.678793 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csx2d" podStartSLOduration=3.684100892 podStartE2EDuration="10.67876724s" podCreationTimestamp="2025-12-16 13:09:47 +0000 UTC" firstStartedPulling="2025-12-16 13:09:49.535621845 +0000 UTC m=+1374.963365641" lastFinishedPulling="2025-12-16 13:09:56.530288193 +0000 UTC m=+1381.958031989" observedRunningTime="2025-12-16 13:09:57.677834117 +0000 UTC m=+1383.105577923" watchObservedRunningTime="2025-12-16 13:09:57.67876724 +0000 UTC m=+1383.106511036" Dec 16 13:09:58 crc kubenswrapper[4757]: I1216 13:09:58.076495 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:58 crc kubenswrapper[4757]: I1216 13:09:58.076786 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:09:59 crc kubenswrapper[4757]: I1216 13:09:59.274160 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:59 crc kubenswrapper[4757]: I1216 13:09:59.275746 4757 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:09:59 crc kubenswrapper[4757]: I1216 13:09:59.646341 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 13:09:59 crc kubenswrapper[4757]: I1216 13:09:59.705626 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-csx2d" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="registry-server" probeResult="failure" output=< Dec 16 13:09:59 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s Dec 16 13:09:59 crc kubenswrapper[4757]: > Dec 16 13:10:01 crc kubenswrapper[4757]: I1216 13:10:01.728603 4757 generic.go:334] "Generic (PLEG): container finished" podID="65337bd1-c674-4817-91c2-ad150639205c" containerID="8f67125ddbcaa77814e69db0c0e1c32f2f3b513d08902bb37e5163bedc8c2aef" exitCode=137 Dec 16 13:10:01 crc kubenswrapper[4757]: I1216 13:10:01.728645 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerDied","Data":"8f67125ddbcaa77814e69db0c0e1c32f2f3b513d08902bb37e5163bedc8c2aef"} Dec 16 13:10:01 crc kubenswrapper[4757]: I1216 13:10:01.729099 4757 scope.go:117] "RemoveContainer" containerID="357eee533356136a47d50ddfe4b10cb06996ef66793b34da2123f1bd22018055" Dec 16 13:10:02 crc kubenswrapper[4757]: I1216 
13:10:02.739291 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d66ddf65b-lmltr" event={"ID":"65337bd1-c674-4817-91c2-ad150639205c","Type":"ContainerStarted","Data":"d4b24c35d47929f24f89e82542a28f81ca685428b409bf31f18a4c0078faf2b7"} Dec 16 13:10:02 crc kubenswrapper[4757]: I1216 13:10:02.743778 4757 generic.go:334] "Generic (PLEG): container finished" podID="399f2693-64b1-4958-ad75-49c45b448ed5" containerID="bc30dc5c1ec1bde48a9125e46de96e8a257135a5fb2342aa9590cfa8754f8773" exitCode=137 Dec 16 13:10:02 crc kubenswrapper[4757]: I1216 13:10:02.743816 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerDied","Data":"bc30dc5c1ec1bde48a9125e46de96e8a257135a5fb2342aa9590cfa8754f8773"} Dec 16 13:10:02 crc kubenswrapper[4757]: I1216 13:10:02.743851 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerStarted","Data":"c796a6497876b25321bec8486b437aa7c77c35a00f83ae805bc05496ed54f2cc"} Dec 16 13:10:02 crc kubenswrapper[4757]: I1216 13:10:02.743869 4757 scope.go:117] "RemoveContainer" containerID="25ab2be0f535088a98cd5974a570660c2b1ab7f032761874bdf1659a40210f03" Dec 16 13:10:04 crc kubenswrapper[4757]: I1216 13:10:04.769551 4757 generic.go:334] "Generic (PLEG): container finished" podID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" containerID="4c2346231f5aa5a3f75291af9b526f1c18d54ecfa10f7c241ed4613de9179a22" exitCode=0 Dec 16 13:10:04 crc kubenswrapper[4757]: I1216 13:10:04.769783 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" event={"ID":"8a6fdfdb-145b-460e-b8e9-9f44e9034f40","Type":"ContainerDied","Data":"4c2346231f5aa5a3f75291af9b526f1c18d54ecfa10f7c241ed4613de9179a22"} Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.243074 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.317639 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-config-data\") pod \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.317777 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-combined-ca-bundle\") pod \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.317994 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-scripts\") pod \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.318089 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvpnq\" (UniqueName: \"kubernetes.io/projected/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-kube-api-access-zvpnq\") pod \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\" (UID: \"8a6fdfdb-145b-460e-b8e9-9f44e9034f40\") " Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.335699 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-scripts" (OuterVolumeSpecName: "scripts") pod "8a6fdfdb-145b-460e-b8e9-9f44e9034f40" (UID: "8a6fdfdb-145b-460e-b8e9-9f44e9034f40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.335755 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-kube-api-access-zvpnq" (OuterVolumeSpecName: "kube-api-access-zvpnq") pod "8a6fdfdb-145b-460e-b8e9-9f44e9034f40" (UID: "8a6fdfdb-145b-460e-b8e9-9f44e9034f40"). InnerVolumeSpecName "kube-api-access-zvpnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.356795 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-config-data" (OuterVolumeSpecName: "config-data") pod "8a6fdfdb-145b-460e-b8e9-9f44e9034f40" (UID: "8a6fdfdb-145b-460e-b8e9-9f44e9034f40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.358306 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a6fdfdb-145b-460e-b8e9-9f44e9034f40" (UID: "8a6fdfdb-145b-460e-b8e9-9f44e9034f40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.420211 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.420295 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.420312 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.420335 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvpnq\" (UniqueName: \"kubernetes.io/projected/8a6fdfdb-145b-460e-b8e9-9f44e9034f40-kube-api-access-zvpnq\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.821309 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" event={"ID":"8a6fdfdb-145b-460e-b8e9-9f44e9034f40","Type":"ContainerDied","Data":"8244939b2a59a8a775a360ff6be86e4523e10a373787b2a7a70671dad3a40a7b"} Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.821358 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8244939b2a59a8a775a360ff6be86e4523e10a373787b2a7a70671dad3a40a7b" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.821387 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bxc8x" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.982080 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 13:10:06 crc kubenswrapper[4757]: E1216 13:10:06.982771 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" containerName="nova-cell0-conductor-db-sync" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.982868 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" containerName="nova-cell0-conductor-db-sync" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.983186 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" containerName="nova-cell0-conductor-db-sync" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.983992 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.988397 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 13:10:06 crc kubenswrapper[4757]: I1216 13:10:06.989138 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-55n4r" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.020379 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.135353 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.135472 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czlgt\" (UniqueName: \"kubernetes.io/projected/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-kube-api-access-czlgt\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.135554 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.236845 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czlgt\" (UniqueName: \"kubernetes.io/projected/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-kube-api-access-czlgt\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.236947 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.237110 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.240805 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.241046 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.260065 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czlgt\" (UniqueName: \"kubernetes.io/projected/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-kube-api-access-czlgt\") pod \"nova-cell0-conductor-0\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") " pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.303266 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:07 crc kubenswrapper[4757]: I1216 13:10:07.881740 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 13:10:07 crc kubenswrapper[4757]: W1216 13:10:07.893401 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod517ce20e_e28f_46f9_a2c9_a2fbe7020b15.slice/crio-687dbab5108e8dabeeae8912ca9551a12953d5a675b8be572aecfac619492b53 WatchSource:0}: Error finding container 687dbab5108e8dabeeae8912ca9551a12953d5a675b8be572aecfac619492b53: Status 404 returned error can't find the container with id 687dbab5108e8dabeeae8912ca9551a12953d5a675b8be572aecfac619492b53 Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.141969 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.208946 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.396496 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csx2d"] Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.843850 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"517ce20e-e28f-46f9-a2c9-a2fbe7020b15","Type":"ContainerStarted","Data":"0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0"} Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.843923 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"517ce20e-e28f-46f9-a2c9-a2fbe7020b15","Type":"ContainerStarted","Data":"687dbab5108e8dabeeae8912ca9551a12953d5a675b8be572aecfac619492b53"} Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.844705 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 13:10:08 crc kubenswrapper[4757]: I1216 13:10:08.868417 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.868391796 podStartE2EDuration="2.868391796s" podCreationTimestamp="2025-12-16 13:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:08.864264844 +0000 UTC m=+1394.292008640" watchObservedRunningTime="2025-12-16 13:10:08.868391796 +0000 UTC m=+1394.296135602" Dec 16 13:10:09 crc kubenswrapper[4757]: I1216 13:10:09.774787 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 13:10:09 crc kubenswrapper[4757]: I1216 13:10:09.867290 4757 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-csx2d" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="registry-server" containerID="cri-o://b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4" gracePeriod=2 Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.664648 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.823959 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvfv6\" (UniqueName: \"kubernetes.io/projected/9f892506-3098-432e-94f2-26b4550c3228-kube-api-access-kvfv6\") pod \"9f892506-3098-432e-94f2-26b4550c3228\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.824060 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-utilities\") pod \"9f892506-3098-432e-94f2-26b4550c3228\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.824215 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-catalog-content\") pod \"9f892506-3098-432e-94f2-26b4550c3228\" (UID: \"9f892506-3098-432e-94f2-26b4550c3228\") " Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.825296 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-utilities" (OuterVolumeSpecName: "utilities") pod "9f892506-3098-432e-94f2-26b4550c3228" (UID: "9f892506-3098-432e-94f2-26b4550c3228"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.839796 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f892506-3098-432e-94f2-26b4550c3228-kube-api-access-kvfv6" (OuterVolumeSpecName: "kube-api-access-kvfv6") pod "9f892506-3098-432e-94f2-26b4550c3228" (UID: "9f892506-3098-432e-94f2-26b4550c3228"). InnerVolumeSpecName "kube-api-access-kvfv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.908471 4757 generic.go:334] "Generic (PLEG): container finished" podID="9f892506-3098-432e-94f2-26b4550c3228" containerID="b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4" exitCode=0 Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.908639 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" gracePeriod=30 Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.908974 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csx2d" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.909380 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerDied","Data":"b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4"} Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.909408 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csx2d" event={"ID":"9f892506-3098-432e-94f2-26b4550c3228","Type":"ContainerDied","Data":"80240389ef6bb7a56020c02473c85a7f0a9bf731d280447545112ac232d6f9ed"} Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.909424 4757 scope.go:117] "RemoveContainer" containerID="b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.926481 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvfv6\" (UniqueName: \"kubernetes.io/projected/9f892506-3098-432e-94f2-26b4550c3228-kube-api-access-kvfv6\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.926523 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:10 crc kubenswrapper[4757]: I1216 13:10:10.998191 4757 scope.go:117] "RemoveContainer" containerID="aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.047170 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f892506-3098-432e-94f2-26b4550c3228" (UID: "9f892506-3098-432e-94f2-26b4550c3228"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.085076 4757 scope.go:117] "RemoveContainer" containerID="5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.142942 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f892506-3098-432e-94f2-26b4550c3228-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.179911 4757 scope.go:117] "RemoveContainer" containerID="b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4" Dec 16 13:10:11 crc kubenswrapper[4757]: E1216 13:10:11.182367 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4\": container with ID starting with b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4 not found: ID does not exist" containerID="b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.182416 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4"} err="failed to get container status \"b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4\": rpc error: code = NotFound desc = could not find container \"b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4\": container with ID starting with b9876f379451ba2d2589b160b5972b1d45b7fd77f3c90f275c0dbec28507f2a4 not found: ID does not exist" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.182443 4757 scope.go:117] "RemoveContainer" containerID="aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4" Dec 16 13:10:11 crc kubenswrapper[4757]: E1216 13:10:11.183374 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4\": container with ID starting with aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4 not found: ID does not exist" containerID="aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.183402 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4"} err="failed to get container status \"aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4\": rpc error: code = NotFound desc = could not find container \"aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4\": container with ID starting with aa91599c7e57e3d6ec3ea7eaf2065522304fcdbdf4787b19d59e94ae641f05d4 not found: ID does not exist" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.183418 4757 scope.go:117] "RemoveContainer" containerID="5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e" Dec 16 13:10:11 crc kubenswrapper[4757]: E1216 13:10:11.184634 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e\": container with ID starting with 5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e not found: ID does not exist" 
containerID="5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.184670 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e"} err="failed to get container status \"5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e\": rpc error: code = NotFound desc = could not find container \"5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e\": container with ID starting with 5ed96a522a705215642f54acb0486dd4b6a9af81002946a8f1f5ee8a68fd383e not found: ID does not exist" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.262278 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csx2d"] Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.280164 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csx2d"] Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.467786 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.468382 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.582316 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:10:11 crc kubenswrapper[4757]: I1216 13:10:11.582363 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d66ddf65b-lmltr" Dec 16 13:10:12 crc kubenswrapper[4757]: I1216 13:10:12.959790 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f892506-3098-432e-94f2-26b4550c3228" path="/var/lib/kubelet/pods/9f892506-3098-432e-94f2-26b4550c3228/volumes" Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.660264 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.660911 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-central-agent" containerID="cri-o://f86c0908b82f8901d9650fe3f6c40f9b49801fd823ab7e7b27e7a8fb5abd3f40" gracePeriod=30 Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.661048 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="sg-core" containerID="cri-o://e7b4d09a7468df03bd152559d7e6ca9742634080e882af18edd905c7d8e629b8" gracePeriod=30 Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.661082 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="proxy-httpd" containerID="cri-o://90ff743035787a1cc9ea0276eda2248d6d9332d554b0c014842bdd074429f7d4" gracePeriod=30 Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.661140 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-notification-agent" containerID="cri-o://2af665ab9c7b64e5c5dd789906b7912141a5b33a0282f1d558aa0c9c07a56791" gracePeriod=30 Dec 16 13:10:13 crc 
kubenswrapper[4757]: I1216 13:10:13.716510 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.939486 4757 generic.go:334] "Generic (PLEG): container finished" podID="ddca11db-397f-4375-965c-4ab9644469f5" containerID="90ff743035787a1cc9ea0276eda2248d6d9332d554b0c014842bdd074429f7d4" exitCode=0 Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.939518 4757 generic.go:334] "Generic (PLEG): container finished" podID="ddca11db-397f-4375-965c-4ab9644469f5" containerID="e7b4d09a7468df03bd152559d7e6ca9742634080e882af18edd905c7d8e629b8" exitCode=2 Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.939533 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerDied","Data":"90ff743035787a1cc9ea0276eda2248d6d9332d554b0c014842bdd074429f7d4"} Dec 16 13:10:13 crc kubenswrapper[4757]: I1216 13:10:13.939589 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerDied","Data":"e7b4d09a7468df03bd152559d7e6ca9742634080e882af18edd905c7d8e629b8"} Dec 16 13:10:14 crc kubenswrapper[4757]: I1216 13:10:14.954111 4757 generic.go:334] "Generic (PLEG): container finished" podID="ddca11db-397f-4375-965c-4ab9644469f5" containerID="f86c0908b82f8901d9650fe3f6c40f9b49801fd823ab7e7b27e7a8fb5abd3f40" exitCode=0 Dec 16 13:10:14 crc kubenswrapper[4757]: I1216 13:10:14.961276 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerDied","Data":"f86c0908b82f8901d9650fe3f6c40f9b49801fd823ab7e7b27e7a8fb5abd3f40"} Dec 16 13:10:16 crc kubenswrapper[4757]: I1216 13:10:16.213549 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.181:3000/\": dial tcp 10.217.0.181:3000: connect: connection refused" Dec 16 13:10:17 crc kubenswrapper[4757]: E1216 13:10:17.307623 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 13:10:17 crc kubenswrapper[4757]: E1216 13:10:17.310295 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 13:10:17 crc kubenswrapper[4757]: E1216 13:10:17.311664 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 16 13:10:17 crc kubenswrapper[4757]: E1216 13:10:17.311732 4757 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.038652 4757 generic.go:334] "Generic (PLEG): container finished" podID="ddca11db-397f-4375-965c-4ab9644469f5" containerID="2af665ab9c7b64e5c5dd789906b7912141a5b33a0282f1d558aa0c9c07a56791" exitCode=0 Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.038722 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerDied","Data":"2af665ab9c7b64e5c5dd789906b7912141a5b33a0282f1d558aa0c9c07a56791"} Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.704137 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860299 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-run-httpd\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860446 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-log-httpd\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860484 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-sg-core-conf-yaml\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860519 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-scripts\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860567 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-combined-ca-bundle\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860588 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-config-data\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.860659 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb5g8\" (UniqueName: \"kubernetes.io/projected/ddca11db-397f-4375-965c-4ab9644469f5-kube-api-access-jb5g8\") pod \"ddca11db-397f-4375-965c-4ab9644469f5\" (UID: \"ddca11db-397f-4375-965c-4ab9644469f5\") " Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.863273 4757 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.863394 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.897785 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddca11db-397f-4375-965c-4ab9644469f5-kube-api-access-jb5g8" (OuterVolumeSpecName: "kube-api-access-jb5g8") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "kube-api-access-jb5g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.897835 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-scripts" (OuterVolumeSpecName: "scripts") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.919840 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.994561 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.994603 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.994616 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.994627 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb5g8\" (UniqueName: \"kubernetes.io/projected/ddca11db-397f-4375-965c-4ab9644469f5-kube-api-access-jb5g8\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:19 crc kubenswrapper[4757]: I1216 13:10:19.994639 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ddca11db-397f-4375-965c-4ab9644469f5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.052660 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ddca11db-397f-4375-965c-4ab9644469f5","Type":"ContainerDied","Data":"2cf8a9289219cd20f14ee131b941a461b291f0321699554321fc6760d93af54c"} Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.052711 4757 scope.go:117] "RemoveContainer" containerID="90ff743035787a1cc9ea0276eda2248d6d9332d554b0c014842bdd074429f7d4" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.052856 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.131301 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.138380 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-config-data" (OuterVolumeSpecName: "config-data") pod "ddca11db-397f-4375-965c-4ab9644469f5" (UID: "ddca11db-397f-4375-965c-4ab9644469f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.203585 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.203637 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddca11db-397f-4375-965c-4ab9644469f5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.236946 4757 scope.go:117] "RemoveContainer" containerID="e7b4d09a7468df03bd152559d7e6ca9742634080e882af18edd905c7d8e629b8" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.286563 4757 scope.go:117] "RemoveContainer" containerID="2af665ab9c7b64e5c5dd789906b7912141a5b33a0282f1d558aa0c9c07a56791" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.309687 4757 scope.go:117] "RemoveContainer" containerID="f86c0908b82f8901d9650fe3f6c40f9b49801fd823ab7e7b27e7a8fb5abd3f40" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.386515 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.395939 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437302 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437725 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="registry-server" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437744 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="registry-server" Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437767 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="proxy-httpd" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437775 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="proxy-httpd" Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437789 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="extract-content" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437796 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="extract-content" Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437816 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-central-agent" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437823 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-central-agent" Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437832 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="extract-utilities" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437840 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f892506-3098-432e-94f2-26b4550c3228" 
containerName="extract-utilities" Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437858 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-notification-agent" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437865 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-notification-agent" Dec 16 13:10:20 crc kubenswrapper[4757]: E1216 13:10:20.437876 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="sg-core" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.437883 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="sg-core" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.438221 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f892506-3098-432e-94f2-26b4550c3228" containerName="registry-server" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.438234 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-notification-agent" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.438246 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="proxy-httpd" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.438257 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="sg-core" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.438284 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddca11db-397f-4375-965c-4ab9644469f5" containerName="ceilometer-central-agent" Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.440382 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.445719 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.454900 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.474157 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.507757 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.507806 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-config-data\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.507837 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-scripts\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.507881 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-log-httpd\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.507925 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-run-httpd\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.507970 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.508161 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7g7c\" (UniqueName: \"kubernetes.io/projected/02eab243-e69e-4823-bc6f-2e6b70d5c80d-kube-api-access-c7g7c\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609513 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-run-httpd\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609594 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609709 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7g7c\" (UniqueName: \"kubernetes.io/projected/02eab243-e69e-4823-bc6f-2e6b70d5c80d-kube-api-access-c7g7c\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609752 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609771 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-config-data\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609798 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-scripts\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.609834 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-log-httpd\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.610547 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-log-httpd\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.610805 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-run-httpd\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.614663 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.616590 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.617665 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-config-data\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.617716 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-scripts\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.634960 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7g7c\" (UniqueName: \"kubernetes.io/projected/02eab243-e69e-4823-bc6f-2e6b70d5c80d-kube-api-access-c7g7c\") pod \"ceilometer-0\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.759543 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 16 13:10:20 crc kubenswrapper[4757]: I1216 13:10:20.978586 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddca11db-397f-4375-965c-4ab9644469f5" path="/var/lib/kubelet/pods/ddca11db-397f-4375-965c-4ab9644469f5/volumes"
Dec 16 13:10:21 crc kubenswrapper[4757]: I1216 13:10:21.267104 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 16 13:10:21 crc kubenswrapper[4757]: W1216 13:10:21.282951 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02eab243_e69e_4823_bc6f_2e6b70d5c80d.slice/crio-b76bcae88aac804c6bfb5024d5cbcb4f0660c13b087862a535e6a4ed551057e4 WatchSource:0}: Error finding container b76bcae88aac804c6bfb5024d5cbcb4f0660c13b087862a535e6a4ed551057e4: Status 404 returned error can't find the container with id b76bcae88aac804c6bfb5024d5cbcb4f0660c13b087862a535e6a4ed551057e4
Dec 16 13:10:21 crc kubenswrapper[4757]: I1216 13:10:21.470289 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 16 13:10:21 crc kubenswrapper[4757]: I1216 13:10:21.584258 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d66ddf65b-lmltr" podUID="65337bd1-c674-4817-91c2-ad150639205c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Dec 16 13:10:22 crc kubenswrapper[4757]: I1216 13:10:22.125372 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerStarted","Data":"b76bcae88aac804c6bfb5024d5cbcb4f0660c13b087862a535e6a4ed551057e4"}
Dec 16 13:10:22 crc kubenswrapper[4757]: E1216 13:10:22.308597 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:22 crc kubenswrapper[4757]: E1216 13:10:22.310200 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:22 crc kubenswrapper[4757]: E1216 13:10:22.311495 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:22 crc kubenswrapper[4757]: E1216 13:10:22.311545 4757 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:23 crc kubenswrapper[4757]: I1216 13:10:23.140257 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerStarted","Data":"918217cd1d1408090d4aa8975a6c182e8dad824497dcdbc2f1a90f19e2da4c26"}
Dec 16 13:10:23 crc kubenswrapper[4757]: I1216 13:10:23.140752 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerStarted","Data":"efb89a168722bcb14295498f72c5a979cfc30cef53986f772a8d062e33ece741"}
Dec 16 13:10:24 crc kubenswrapper[4757]: I1216 13:10:24.155766 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerStarted","Data":"74c6d1460b7105d99b13b5487f82664d050b44bab32e6ec47dd7f4895078d96a"}
Dec 16 13:10:26 crc kubenswrapper[4757]: I1216 13:10:26.176346 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerStarted","Data":"89d66b8088206c9ce386d6c0b3b29c613c82d13e1673d80000c75c1336a20ceb"}
Dec 16 13:10:26 crc kubenswrapper[4757]: I1216 13:10:26.176845 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 16 13:10:26 crc kubenswrapper[4757]: I1216 13:10:26.202458 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.862184186 podStartE2EDuration="6.20243146s" podCreationTimestamp="2025-12-16 13:10:20 +0000 UTC" firstStartedPulling="2025-12-16 13:10:21.289380242 +0000 UTC m=+1406.717124038" lastFinishedPulling="2025-12-16 13:10:25.629627516 +0000 UTC m=+1411.057371312" observedRunningTime="2025-12-16 13:10:26.195404947 +0000 UTC m=+1411.623148763" watchObservedRunningTime="2025-12-16 13:10:26.20243146 +0000 UTC m=+1411.630175256"
Dec 16 13:10:27 crc kubenswrapper[4757]: E1216 13:10:27.307267 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:27 crc kubenswrapper[4757]: E1216 13:10:27.309586 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:27 crc kubenswrapper[4757]: E1216 13:10:27.311301 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:27 crc kubenswrapper[4757]: E1216 13:10:27.311394 4757 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:32 crc kubenswrapper[4757]: E1216 13:10:32.306454 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:32 crc kubenswrapper[4757]: E1216 13:10:32.308939 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:32 crc kubenswrapper[4757]: E1216 13:10:32.310435 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:32 crc kubenswrapper[4757]: E1216 13:10:32.310488 4757 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:33 crc kubenswrapper[4757]: I1216 13:10:33.973642 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d66ddf65b-lmltr"
Dec 16 13:10:34 crc kubenswrapper[4757]: I1216 13:10:34.011448 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75ccc7d896-jmrk9"
Dec 16 13:10:35 crc kubenswrapper[4757]: I1216 13:10:35.829383 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-75ccc7d896-jmrk9"
Dec 16 13:10:35 crc kubenswrapper[4757]: I1216 13:10:35.837082 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d66ddf65b-lmltr"
Dec 16 13:10:35 crc kubenswrapper[4757]: I1216 13:10:35.936556 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75ccc7d896-jmrk9"]
Dec 16 13:10:36 crc kubenswrapper[4757]: I1216 13:10:36.274298 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon-log" containerID="cri-o://27930a46a589b4d3638de24e488861d0ce2d79305f0f2fcc652982d005f8b8df" gracePeriod=30
Dec 16 13:10:36 crc kubenswrapper[4757]: I1216 13:10:36.274336 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" containerID="cri-o://c796a6497876b25321bec8486b437aa7c77c35a00f83ae805bc05496ed54f2cc" gracePeriod=30
Dec 16 13:10:37 crc kubenswrapper[4757]: E1216 13:10:37.306567 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:37 crc kubenswrapper[4757]: E1216 13:10:37.308475 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:37 crc kubenswrapper[4757]: E1216 13:10:37.309904 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 16 13:10:37 crc kubenswrapper[4757]: E1216 13:10:37.309991 4757 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:40 crc kubenswrapper[4757]: I1216 13:10:40.307965 4757 generic.go:334] "Generic (PLEG): container finished" podID="399f2693-64b1-4958-ad75-49c45b448ed5" containerID="c796a6497876b25321bec8486b437aa7c77c35a00f83ae805bc05496ed54f2cc" exitCode=0
Dec 16 13:10:40 crc kubenswrapper[4757]: I1216 13:10:40.308534 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerDied","Data":"c796a6497876b25321bec8486b437aa7c77c35a00f83ae805bc05496ed54f2cc"}
Dec 16 13:10:40 crc kubenswrapper[4757]: I1216 13:10:40.308578 4757 scope.go:117] "RemoveContainer" containerID="bc30dc5c1ec1bde48a9125e46de96e8a257135a5fb2342aa9590cfa8754f8773"
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.316701 4757 generic.go:334] "Generic (PLEG): container finished" podID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" exitCode=137
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.316795 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"517ce20e-e28f-46f9-a2c9-a2fbe7020b15","Type":"ContainerDied","Data":"0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0"}
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.317094 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"517ce20e-e28f-46f9-a2c9-a2fbe7020b15","Type":"ContainerDied","Data":"687dbab5108e8dabeeae8912ca9551a12953d5a675b8be572aecfac619492b53"}
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.317110 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687dbab5108e8dabeeae8912ca9551a12953d5a675b8be572aecfac619492b53"
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.386211 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.469042 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.514091 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-combined-ca-bundle\") pod \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") "
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.514262 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czlgt\" (UniqueName: \"kubernetes.io/projected/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-kube-api-access-czlgt\") pod \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") "
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.514358 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-config-data\") pod \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\" (UID: \"517ce20e-e28f-46f9-a2c9-a2fbe7020b15\") "
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.520309 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-kube-api-access-czlgt" (OuterVolumeSpecName: "kube-api-access-czlgt") pod "517ce20e-e28f-46f9-a2c9-a2fbe7020b15" (UID: "517ce20e-e28f-46f9-a2c9-a2fbe7020b15"). InnerVolumeSpecName "kube-api-access-czlgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.539654 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-config-data" (OuterVolumeSpecName: "config-data") pod "517ce20e-e28f-46f9-a2c9-a2fbe7020b15" (UID: "517ce20e-e28f-46f9-a2c9-a2fbe7020b15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.545918 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "517ce20e-e28f-46f9-a2c9-a2fbe7020b15" (UID: "517ce20e-e28f-46f9-a2c9-a2fbe7020b15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.617141 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-config-data\") on node \"crc\" DevicePath \"\""
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.617172 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 16 13:10:41 crc kubenswrapper[4757]: I1216 13:10:41.617184 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czlgt\" (UniqueName: \"kubernetes.io/projected/517ce20e-e28f-46f9-a2c9-a2fbe7020b15-kube-api-access-czlgt\") on node \"crc\" DevicePath \"\""
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.328979 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.364315 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.372743 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.391539 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 13:10:42 crc kubenswrapper[4757]: E1216 13:10:42.392361 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.392502 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.392856 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" containerName="nova-cell0-conductor-conductor"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.394618 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.399466 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-55n4r"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.399651 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.405399 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.535061 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-642lf\" (UniqueName: \"kubernetes.io/projected/614c552b-9e07-4f84-becd-3dfa75851309-kube-api-access-642lf\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.535225 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614c552b-9e07-4f84-becd-3dfa75851309-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.535257 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614c552b-9e07-4f84-becd-3dfa75851309-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.637680 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614c552b-9e07-4f84-becd-3dfa75851309-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.637736 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614c552b-9e07-4f84-becd-3dfa75851309-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.637776 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-642lf\" (UniqueName: \"kubernetes.io/projected/614c552b-9e07-4f84-becd-3dfa75851309-kube-api-access-642lf\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.644033 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614c552b-9e07-4f84-becd-3dfa75851309-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.646716 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614c552b-9e07-4f84-becd-3dfa75851309-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.659121 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-642lf\" (UniqueName: \"kubernetes.io/projected/614c552b-9e07-4f84-becd-3dfa75851309-kube-api-access-642lf\") pod \"nova-cell0-conductor-0\" (UID: \"614c552b-9e07-4f84-becd-3dfa75851309\") " pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.747561 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:42 crc kubenswrapper[4757]: I1216 13:10:42.974441 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517ce20e-e28f-46f9-a2c9-a2fbe7020b15" path="/var/lib/kubelet/pods/517ce20e-e28f-46f9-a2c9-a2fbe7020b15/volumes"
Dec 16 13:10:43 crc kubenswrapper[4757]: I1216 13:10:43.257721 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 16 13:10:43 crc kubenswrapper[4757]: I1216 13:10:43.343496 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"614c552b-9e07-4f84-becd-3dfa75851309","Type":"ContainerStarted","Data":"c0b61b41d31fcdf747469074f901959e5d67d42a25894079278f941f0e16460c"}
Dec 16 13:10:44 crc kubenswrapper[4757]: I1216 13:10:44.354451 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"614c552b-9e07-4f84-becd-3dfa75851309","Type":"ContainerStarted","Data":"efeb3985b14c83faabcff5f54f3b3eae09c94cc0f10cc771784b715dd0c339aa"}
Dec 16 13:10:44 crc kubenswrapper[4757]: I1216 13:10:44.356190 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:44 crc kubenswrapper[4757]: I1216 13:10:44.390236 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.390214493 podStartE2EDuration="2.390214493s" podCreationTimestamp="2025-12-16 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:44.379047766 +0000 UTC m=+1429.806791572" watchObservedRunningTime="2025-12-16 13:10:44.390214493 +0000 UTC m=+1429.817958309"
Dec 16 13:10:50 crc kubenswrapper[4757]: I1216 13:10:50.769427 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 16 13:10:51 crc kubenswrapper[4757]: I1216 13:10:51.181843 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 13:10:51 crc kubenswrapper[4757]: I1216 13:10:51.181947 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 13:10:51 crc kubenswrapper[4757]: I1216 13:10:51.468686 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 16 13:10:52 crc kubenswrapper[4757]: I1216 13:10:52.780094 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.532610 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-b4m4q"]
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.534157 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.535786 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.537171 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.550656 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b4m4q"]
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.739471 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.740822 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-scripts\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.740970 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnptg\" (UniqueName: \"kubernetes.io/projected/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-kube-api-access-wnptg\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.741108 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-config-data\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.741225 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.741446 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.754995 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.763446 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843101 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843170 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-config-data\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843213 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-logs\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843241 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcgbq\" (UniqueName: \"kubernetes.io/projected/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-kube-api-access-fcgbq\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843287 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-scripts\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843318 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnptg\" (UniqueName: \"kubernetes.io/projected/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-kube-api-access-wnptg\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843343 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.843370 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-config-data\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.852434 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-scripts\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.852479 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-config-data\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.857708 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.883244 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.884821 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.891383 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.946171 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.946774 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-config-data\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.946842 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-logs\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.946884 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcgbq\" (UniqueName: \"kubernetes.io/projected/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-kube-api-access-fcgbq\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.947742 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-logs\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.964788 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-config-data\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.965167 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.972222 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:53 crc kubenswrapper[4757]: I1216 13:10:53.993721 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnptg\" (UniqueName: \"kubernetes.io/projected/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-kube-api-access-wnptg\") pod \"nova-cell0-cell-mapping-b4m4q\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.009943 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.011551 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.021368 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.032668 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcgbq\" (UniqueName: \"kubernetes.io/projected/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-kube-api-access-fcgbq\") pod \"nova-api-0\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " pod="openstack/nova-api-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.049767 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.054894 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflgx\" (UniqueName: \"kubernetes.io/projected/0357997f-6169-4fc2-9c56-e4ccfb8fb694-kube-api-access-mflgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.055324 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.055614 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfvv\" (UniqueName: \"kubernetes.io/projected/c8a8bd03-898d-43fb-84d1-700e8fff5a26-kube-api-access-4gfvv\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.055771 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.055941 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-config-data\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.056055 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.077558 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.158333 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfvv\" (UniqueName: \"kubernetes.io/projected/c8a8bd03-898d-43fb-84d1-700e8fff5a26-kube-api-access-4gfvv\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.158680 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.158832 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-config-data\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.158947 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.159113 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mflgx\" (UniqueName: \"kubernetes.io/projected/0357997f-6169-4fc2-9c56-e4ccfb8fb694-kube-api-access-mflgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.159228 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.164913 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.172875 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-config-data\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.173587 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.176726 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.183201 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.185125 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.190479 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b4m4q"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.205444 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.266626 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflgx\" (UniqueName: \"kubernetes.io/projected/0357997f-6169-4fc2-9c56-e4ccfb8fb694-kube-api-access-mflgx\") pod \"nova-cell1-novncproxy-0\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.290654 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gfvv\" (UniqueName: \"kubernetes.io/projected/c8a8bd03-898d-43fb-84d1-700e8fff5a26-kube-api-access-4gfvv\") pod \"nova-scheduler-0\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.307068 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.369216 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-config-data\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.369276 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdc26\" (UniqueName: \"kubernetes.io/projected/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-kube-api-access-mdc26\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.369313 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-logs\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.369334 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.432520 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.442784 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.446594 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-tsdxz"]
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.448290 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.470779 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-config-data\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.470820 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdc26\" (UniqueName: \"kubernetes.io/projected/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-kube-api-access-mdc26\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.470862 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-logs\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.470884 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.472620 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-logs\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.479954 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-config-data\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.496165 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.556898 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdc26\" (UniqueName: \"kubernetes.io/projected/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-kube-api-access-mdc26\") pod \"nova-metadata-0\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") " pod="openstack/nova-metadata-0"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.557239 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-tsdxz"]
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.573380 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.573474 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvpm\" (UniqueName: \"kubernetes.io/projected/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-kube-api-access-nsvpm\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.573599 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-config\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.573629 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.573669 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-svc\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.573745 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.684198 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.684280 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvpm\" (UniqueName: \"kubernetes.io/projected/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-kube-api-access-nsvpm\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.684380 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-config\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.684413 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.684452 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-svc\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.684531 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.685788 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.686485 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.687474 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-config\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.703399 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-svc\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:54 crc kubenswrapper[4757]: 
I1216 13:10:54.703612 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.742250 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvpm\" (UniqueName: \"kubernetes.io/projected/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-kube-api-access-nsvpm\") pod \"dnsmasq-dns-bccf8f775-tsdxz\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.797402 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" Dec 16 13:10:54 crc kubenswrapper[4757]: I1216 13:10:54.805751 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.277959 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.472267 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b4m4q"] Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.522935 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.560084 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d35ab5a-645f-40c9-ba7a-288a5ed7722a","Type":"ContainerStarted","Data":"d6300d7b8ff2ed3764443c70d28e1065694c49b3a805bdc3f49c3a84b536ace9"} Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.704844 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:10:55 crc kubenswrapper[4757]: W1216 13:10:55.720831 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0357997f_6169_4fc2_9c56_e4ccfb8fb694.slice/crio-63d211ec315f5fdc9626001e2765f99b27d9d0d39dc4aaf6a0d7d93845be90ce WatchSource:0}: Error finding container 63d211ec315f5fdc9626001e2765f99b27d9d0d39dc4aaf6a0d7d93845be90ce: Status 404 returned error can't find the container with id 63d211ec315f5fdc9626001e2765f99b27d9d0d39dc4aaf6a0d7d93845be90ce Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.734879 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.857182 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwttf"] Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.858847 4757 util.go:30] "No sandbox for pod can be found. 
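The interleaved "SyncLoop ADD/UPDATE" entries with source="api" are the kubelet's sync loop consuming watch events for pods scheduled to this node, while the "SyncLoop (PLEG)" entries relay container state changes from the runtime back into the same loop. The "Failed to process watch event ... Status 404" warnings are a benign race: cAdvisor notices a new cgroup before CRI-O has finished registering the container. An external observer can reproduce the same ADD/UPDATE/DELETE stream with a client-go informer; a sketch assuming a reachable kubeconfig, not kubelet internals:

    package main

    import (
        "fmt"
        "time"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Kubeconfig location is an assumption (~/.kube/config).
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        factory := informers.NewSharedInformerFactoryWithOptions(
            cs, 30*time.Second, informers.WithNamespace("openstack"))
        factory.Core().V1().Pods().Informer().AddEventHandler(cache.ResourceEventHandlerFuncs{
            AddFunc:    func(obj interface{}) { fmt.Println("ADD", obj.(*corev1.Pod).Name) },
            UpdateFunc: func(_, obj interface{}) { fmt.Println("UPDATE", obj.(*corev1.Pod).Name) },
            DeleteFunc: func(obj interface{}) { fmt.Println("DELETE") }, // obj may be a tombstone
        })
        stop := make(chan struct{})
        factory.Start(stop)
        factory.WaitForCacheSync(stop)
        select {} // stream events until killed
    }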
Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.863603 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.863805 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.901580 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwttf"]
Dec 16 13:10:55 crc kubenswrapper[4757]: I1216 13:10:55.938102 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-tsdxz"]
Dec 16 13:10:55 crc kubenswrapper[4757]: W1216 13:10:55.948782 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5361789a_bbaa_4f8f_a6c1_c1bda3e8cfff.slice/crio-123e0a83426f420ee9acb931ae1e9f9d02502b2a2cdeae21408e14693f1a78e4 WatchSource:0}: Error finding container 123e0a83426f420ee9acb931ae1e9f9d02502b2a2cdeae21408e14693f1a78e4: Status 404 returned error can't find the container with id 123e0a83426f420ee9acb931ae1e9f9d02502b2a2cdeae21408e14693f1a78e4
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.035359 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.035731 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgzdg\" (UniqueName: \"kubernetes.io/projected/1c6b1bba-b68a-4912-aada-0229a7152426-kube-api-access-vgzdg\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.035759 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-config-data\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.035968 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-scripts\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.166625 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-scripts\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.166790 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.166824 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgzdg\" (UniqueName: \"kubernetes.io/projected/1c6b1bba-b68a-4912-aada-0229a7152426-kube-api-access-vgzdg\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.166853 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-config-data\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.181054 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-scripts\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.191818 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgzdg\" (UniqueName: \"kubernetes.io/projected/1c6b1bba-b68a-4912-aada-0229a7152426-kube-api-access-vgzdg\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.193546 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.194833 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-config-data\") pod \"nova-cell1-conductor-db-sync-hwttf\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.208487 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwttf"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.584335 4757 generic.go:334] "Generic (PLEG): container finished" podID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerID="4bec5ae47d5ca16ebe65ad9171ac584d9eb1e957faaccfd36319aeb5243beb7f" exitCode=0
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.585231 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" event={"ID":"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff","Type":"ContainerDied","Data":"4bec5ae47d5ca16ebe65ad9171ac584d9eb1e957faaccfd36319aeb5243beb7f"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.585283 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" event={"ID":"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff","Type":"ContainerStarted","Data":"123e0a83426f420ee9acb931ae1e9f9d02502b2a2cdeae21408e14693f1a78e4"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.607067 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b4m4q" event={"ID":"32ea5e67-160d-47fd-9bb3-70141a4bcdb1","Type":"ContainerStarted","Data":"bb5bac903cac19e8af0a42238e02f346db20cfc7fc52e6187c10601921713636"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.607108 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b4m4q" event={"ID":"32ea5e67-160d-47fd-9bb3-70141a4bcdb1","Type":"ContainerStarted","Data":"2c9b87277e05d98ef77a3269deea28f8da5d7549c900889243f4a1f6833d27c9"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.637568 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6","Type":"ContainerStarted","Data":"bdc9ab6a2a4c33a523de1062182969c87d2c337091f2ab32f1ddb8cef08087fe"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.643267 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0357997f-6169-4fc2-9c56-e4ccfb8fb694","Type":"ContainerStarted","Data":"63d211ec315f5fdc9626001e2765f99b27d9d0d39dc4aaf6a0d7d93845be90ce"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.669685 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8a8bd03-898d-43fb-84d1-700e8fff5a26","Type":"ContainerStarted","Data":"95a87bd64251adcb47e73fd7b2b3dd944d27b91b89dcc2fb2038e8f7b93884bf"}
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.688075 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-b4m4q" podStartSLOduration=3.687993522 podStartE2EDuration="3.687993522s" podCreationTimestamp="2025-12-16 13:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:56.675087332 +0000 UTC m=+1442.102831128" watchObservedRunningTime="2025-12-16 13:10:56.687993522 +0000 UTC m=+1442.115737318"
Dec 16 13:10:56 crc kubenswrapper[4757]: I1216 13:10:56.841767 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwttf"]
Dec 16 13:10:56 crc kubenswrapper[4757]: W1216 13:10:56.868742 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c6b1bba_b68a_4912_aada_0229a7152426.slice/crio-d7d9a78689b10e989ad0550955006abcbf461e038ee689ddeebdba8b1fc12418 WatchSource:0}: Error finding container d7d9a78689b10e989ad0550955006abcbf461e038ee689ddeebdba8b1fc12418: Status 404 returned error can't find the container with id d7d9a78689b10e989ad0550955006abcbf461e038ee689ddeebdba8b1fc12418
Dec 16 13:10:57 crc kubenswrapper[4757]: I1216 13:10:57.692114 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" event={"ID":"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff","Type":"ContainerStarted","Data":"58ff04a57656f775a9c1ef03e9c0bde8af2ec49fb928b2e9c092889d83a8dcbc"}
Dec 16 13:10:57 crc kubenswrapper[4757]: I1216 13:10:57.695201 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz"
Dec 16 13:10:57 crc kubenswrapper[4757]: I1216 13:10:57.706277 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwttf" event={"ID":"1c6b1bba-b68a-4912-aada-0229a7152426","Type":"ContainerStarted","Data":"3de9e060dc0c56b825fb723b67ad28f9320965646c9a8d7934c870822718e4e7"}
Dec 16 13:10:57 crc kubenswrapper[4757]: I1216 13:10:57.706329 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwttf" event={"ID":"1c6b1bba-b68a-4912-aada-0229a7152426","Type":"ContainerStarted","Data":"d7d9a78689b10e989ad0550955006abcbf461e038ee689ddeebdba8b1fc12418"}
Dec 16 13:10:57 crc kubenswrapper[4757]: I1216 13:10:57.725665 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" podStartSLOduration=3.725643405 podStartE2EDuration="3.725643405s" podCreationTimestamp="2025-12-16 13:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:57.71735436 +0000 UTC m=+1443.145098166" watchObservedRunningTime="2025-12-16 13:10:57.725643405 +0000 UTC m=+1443.153387201"
Dec 16 13:10:58 crc kubenswrapper[4757]: I1216 13:10:58.457487 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hwttf" podStartSLOduration=3.457468601 podStartE2EDuration="3.457468601s" podCreationTimestamp="2025-12-16 13:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:57.736079804 +0000 UTC m=+1443.163823620" watchObservedRunningTime="2025-12-16 13:10:58.457468601 +0000 UTC m=+1443.885212397"
Dec 16 13:10:58 crc kubenswrapper[4757]: I1216 13:10:58.461759 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 16 13:10:58 crc kubenswrapper[4757]: I1216 13:10:58.470422 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 16 13:10:59 crc kubenswrapper[4757]: I1216 13:10:59.258425 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 13:10:59 crc kubenswrapper[4757]: I1216 13:10:59.258678 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" containerName="kube-state-metrics" containerID="cri-o://03d15000b00351d5d25b7208264e452cfc388862e89f1407d069b9b70859c816" gracePeriod=30
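The pod_startup_latency_tracker entries are internally consistent and worth decoding: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (for dnsmasq-dns-bccf8f775-tsdxz: 13:10:57.725643405 - 13:10:54 = 3.725643405s), and when a real image pull happened, podStartSLOduration additionally subtracts the lastFinishedPulling - firstStartedPulling window; pods whose pull fields are the zero time "0001-01-01 00:00:00" report SLO == E2E. A quick check of that arithmetic against the nova-scheduler-0 entry that appears further down:

    package main

    import (
        "fmt"
        "time"
    )

    // Re-derives the tracker's numbers for nova-scheduler-0:
    // E2E = watchObservedRunningTime - podCreationTimestamp;
    // SLO = E2E - (lastFinishedPulling - firstStartedPulling).
    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-12-16 13:10:53 +0000 UTC")
        firstPull := parse("2025-12-16 13:10:55.552971074 +0000 UTC")
        lastPull := parse("2025-12-16 13:10:59.742983227 +0000 UTC")
        observed := parse("2025-12-16 13:11:00.844734509 +0000 UTC")

        e2e := observed.Sub(created)
        slo := e2e - lastPull.Sub(firstPull)
        fmt.Println(e2e) // 7.844734509s (podStartE2EDuration)
        fmt.Println(slo) // 3.654722356s (podStartSLOduration)
    }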
Dec 16 13:10:59 crc kubenswrapper[4757]: I1216 13:10:59.802142 4757 generic.go:334] "Generic (PLEG): container finished" podID="6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" containerID="03d15000b00351d5d25b7208264e452cfc388862e89f1407d069b9b70859c816" exitCode=2
Dec 16 13:10:59 crc kubenswrapper[4757]: I1216 13:10:59.803102 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46","Type":"ContainerDied","Data":"03d15000b00351d5d25b7208264e452cfc388862e89f1407d069b9b70859c816"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.134130 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.183915 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld2mn\" (UniqueName: \"kubernetes.io/projected/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46-kube-api-access-ld2mn\") pod \"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46\" (UID: \"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46\") "
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.238743 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46-kube-api-access-ld2mn" (OuterVolumeSpecName: "kube-api-access-ld2mn") pod "6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" (UID: "6cfaebf1-0d50-42e7-9f5a-94b0894a0a46"). InnerVolumeSpecName "kube-api-access-ld2mn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.286249 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld2mn\" (UniqueName: \"kubernetes.io/projected/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46-kube-api-access-ld2mn\") on node \"crc\" DevicePath \"\""
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.822054 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8a8bd03-898d-43fb-84d1-700e8fff5a26","Type":"ContainerStarted","Data":"0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.831409 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6cfaebf1-0d50-42e7-9f5a-94b0894a0a46","Type":"ContainerDied","Data":"10e4e6476d546d460f7758c5727f1409ad4ea39efb50e3ad4e9d65830d8d1d77"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.831470 4757 scope.go:117] "RemoveContainer" containerID="03d15000b00351d5d25b7208264e452cfc388862e89f1407d069b9b70859c816"
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.831602 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.844758 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.6547223559999997 podStartE2EDuration="7.844734509s" podCreationTimestamp="2025-12-16 13:10:53 +0000 UTC" firstStartedPulling="2025-12-16 13:10:55.552971074 +0000 UTC m=+1440.980714870" lastFinishedPulling="2025-12-16 13:10:59.742983227 +0000 UTC m=+1445.170727023" observedRunningTime="2025-12-16 13:11:00.840884974 +0000 UTC m=+1446.268628770" watchObservedRunningTime="2025-12-16 13:11:00.844734509 +0000 UTC m=+1446.272478305"
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.850298 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6","Type":"ContainerStarted","Data":"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.850353 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6","Type":"ContainerStarted","Data":"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.850506 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-log" containerID="cri-o://201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9" gracePeriod=30
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.850770 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-metadata" containerID="cri-o://cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e" gracePeriod=30
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.875333 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0357997f-6169-4fc2-9c56-e4ccfb8fb694","Type":"ContainerStarted","Data":"8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.875787 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0357997f-6169-4fc2-9c56-e4ccfb8fb694" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d" gracePeriod=30
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.898810 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.906250 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d35ab5a-645f-40c9-ba7a-288a5ed7722a","Type":"ContainerStarted","Data":"b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.906301 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d35ab5a-645f-40c9-ba7a-288a5ed7722a","Type":"ContainerStarted","Data":"ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789"}
Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.914760 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
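The kube-state-metrics-0 teardown above is the canonical sequence: SyncLoop DELETE from the API, "Killing container with a grace period" (gracePeriod=30), ContainerDied with the container's exit code, volume unmount and detach, then SyncLoop REMOVE once the API object is gone. Issuing such a delete from a client looks like this; a sketch, where leaving the grace period nil would fall back to the pod's own terminationGracePeriodSeconds:

    package sketch

    import (
        "context"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // deleteWithGrace issues the API-side DELETE that starts the sequence
    // above; the kubelet then logs "Killing container with a grace period",
    // sends SIGTERM, and escalates to SIGKILL once the period expires.
    func deleteWithGrace(ctx context.Context, cs kubernetes.Interface, ns, name string, seconds int64) error {
        return cs.CoreV1().Pods(ns).Delete(ctx, name, metav1.DeleteOptions{
            GracePeriodSeconds: &seconds,
        })
    }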
pods=["openstack/kube-state-metrics-0"] Dec 16 13:11:00 crc kubenswrapper[4757]: I1216 13:11:00.934707 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.855184445 podStartE2EDuration="6.934687238s" podCreationTimestamp="2025-12-16 13:10:54 +0000 UTC" firstStartedPulling="2025-12-16 13:10:55.703197698 +0000 UTC m=+1441.130941494" lastFinishedPulling="2025-12-16 13:10:59.782700491 +0000 UTC m=+1445.210444287" observedRunningTime="2025-12-16 13:11:00.897354363 +0000 UTC m=+1446.325098159" watchObservedRunningTime="2025-12-16 13:11:00.934687238 +0000 UTC m=+1446.362431034" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:00.993801 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" path="/var/lib/kubelet/pods/6cfaebf1-0d50-42e7-9f5a-94b0894a0a46/volumes" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:00.996612 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 13:11:01 crc kubenswrapper[4757]: E1216 13:11:00.999166 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" containerName="kube-state-metrics" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:00.999185 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" containerName="kube-state-metrics" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.002187 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfaebf1-0d50-42e7-9f5a-94b0894a0a46" containerName="kube-state-metrics" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.003084 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.014204 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.014456 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.053096 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.094467 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.069065895 podStartE2EDuration="8.094447107s" podCreationTimestamp="2025-12-16 13:10:53 +0000 UTC" firstStartedPulling="2025-12-16 13:10:55.726377482 +0000 UTC m=+1441.154121278" lastFinishedPulling="2025-12-16 13:10:59.751758694 +0000 UTC m=+1445.179502490" observedRunningTime="2025-12-16 13:11:00.921871891 +0000 UTC m=+1446.349615677" watchObservedRunningTime="2025-12-16 13:11:01.094447107 +0000 UTC m=+1446.522190903" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.097766 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.655413964 podStartE2EDuration="8.097754299s" podCreationTimestamp="2025-12-16 13:10:53 +0000 UTC" firstStartedPulling="2025-12-16 13:10:55.308580619 +0000 UTC m=+1440.736324415" lastFinishedPulling="2025-12-16 13:10:59.750920954 +0000 UTC m=+1445.178664750" observedRunningTime="2025-12-16 13:11:00.940864581 +0000 UTC m=+1446.368608387" watchObservedRunningTime="2025-12-16 13:11:01.097754299 +0000 UTC m=+1446.525498095" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.118242 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5r4\" (UniqueName: \"kubernetes.io/projected/2457fa41-c003-450d-a55e-f67c36155f94-kube-api-access-rr5r4\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.119308 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.119397 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.120111 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.221395 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rr5r4\" (UniqueName: \"kubernetes.io/projected/2457fa41-c003-450d-a55e-f67c36155f94-kube-api-access-rr5r4\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.221499 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.221585 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.221626 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.230886 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.246680 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.247233 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2457fa41-c003-450d-a55e-f67c36155f94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.261748 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5r4\" (UniqueName: \"kubernetes.io/projected/2457fa41-c003-450d-a55e-f67c36155f94-kube-api-access-rr5r4\") pod \"kube-state-metrics-0\" (UID: \"2457fa41-c003-450d-a55e-f67c36155f94\") " pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.350860 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.468518 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75ccc7d896-jmrk9" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.469030 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.924464 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.925616 4757 generic.go:334] "Generic (PLEG): container finished" podID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerID="cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e" exitCode=0 Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.925640 4757 generic.go:334] "Generic (PLEG): container finished" podID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerID="201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9" exitCode=143 Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.926115 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6","Type":"ContainerDied","Data":"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e"} Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.926189 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6","Type":"ContainerDied","Data":"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"} Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.926205 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6","Type":"ContainerDied","Data":"bdc9ab6a2a4c33a523de1062182969c87d2c337091f2ab32f1ddb8cef08087fe"} Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.926240 4757 scope.go:117] "RemoveContainer" containerID="cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.959620 4757 scope.go:117] "RemoveContainer" containerID="201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.983723 4757 scope.go:117] "RemoveContainer" containerID="cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e" Dec 16 13:11:01 crc kubenswrapper[4757]: E1216 13:11:01.984259 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e\": container with ID starting with cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e not found: ID does not exist" containerID="cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e" Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.984293 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e"} err="failed to get container status \"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e\": 
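The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" clusters here are noisy but harmless: several cleanup paths race to remove the same dead containers, and whoever loses simply gets NotFound back from the runtime. The standard way to keep such cleanup idempotent over a gRPC API is to swallow NotFound; a sketch of the pattern, not the kubelet's actual code:

    package sketch

    import (
        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent keeps container cleanup idempotent by treating
    // NotFound from the runtime as success, which is why the errors
    // above are logged and then effectively ignored.
    func removeIfPresent(remove func(id string) error, id string) error {
        if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }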
Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.984315 4757 scope.go:117] "RemoveContainer" containerID="201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"
Dec 16 13:11:01 crc kubenswrapper[4757]: E1216 13:11:01.988266 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9\": container with ID starting with 201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9 not found: ID does not exist" containerID="201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"
Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.988345 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"} err="failed to get container status \"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9\": rpc error: code = NotFound desc = could not find container \"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9\": container with ID starting with 201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9 not found: ID does not exist"
Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.988397 4757 scope.go:117] "RemoveContainer" containerID="cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e"
Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.990846 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e"} err="failed to get container status \"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e\": rpc error: code = NotFound desc = could not find container \"cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e\": container with ID starting with cb767248ccfd77799dd87e241473a43ab6bb6578eed463e48c8a2dd9b8ccbb0e not found: ID does not exist"
Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.990893 4757 scope.go:117] "RemoveContainer" containerID="201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"
Dec 16 13:11:01 crc kubenswrapper[4757]: I1216 13:11:01.991209 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9"} err="failed to get container status \"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9\": rpc error: code = NotFound desc = could not find container \"201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9\": container with ID starting with 201f66382bd2c37ccaf018b95f85268f5934e47f96fa069a0063c630e6c149b9 not found: ID does not exist"
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.024365 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.040443 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-config-data\") pod \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") "
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.040736 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-combined-ca-bundle\") pod \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") "
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.040808 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdc26\" (UniqueName: \"kubernetes.io/projected/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-kube-api-access-mdc26\") pod \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") "
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.040859 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-logs\") pod \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\" (UID: \"fc3b1969-e4b8-425a-af4a-a2ec019ff7e6\") "
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.043221 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-logs" (OuterVolumeSpecName: "logs") pod "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" (UID: "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.045197 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-logs\") on node \"crc\" DevicePath \"\""
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.051130 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-kube-api-access-mdc26" (OuterVolumeSpecName: "kube-api-access-mdc26") pod "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" (UID: "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6"). InnerVolumeSpecName "kube-api-access-mdc26". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.151740 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdc26\" (UniqueName: \"kubernetes.io/projected/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-kube-api-access-mdc26\") on node \"crc\" DevicePath \"\""
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.496700 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" (UID: "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.497133 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-config-data" (OuterVolumeSpecName: "config-data") pod "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" (UID: "fc3b1969-e4b8-425a-af4a-a2ec019ff7e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.557878 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.558211 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-central-agent" containerID="cri-o://efb89a168722bcb14295498f72c5a979cfc30cef53986f772a8d062e33ece741" gracePeriod=30 Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.558864 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-notification-agent" containerID="cri-o://918217cd1d1408090d4aa8975a6c182e8dad824497dcdbc2f1a90f19e2da4c26" gracePeriod=30 Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.559079 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="proxy-httpd" containerID="cri-o://89d66b8088206c9ce386d6c0b3b29c613c82d13e1673d80000c75c1336a20ceb" gracePeriod=30 Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.560804 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.560836 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.560873 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="sg-core" containerID="cri-o://74c6d1460b7105d99b13b5487f82664d050b44bab32e6ec47dd7f4895078d96a" gracePeriod=30 Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.937617 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.939390 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2457fa41-c003-450d-a55e-f67c36155f94","Type":"ContainerStarted","Data":"380a7f522bc9d785923ea32033de697a58f3f1d26ccf0fd764283fdaf9e71591"} Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.939444 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2457fa41-c003-450d-a55e-f67c36155f94","Type":"ContainerStarted","Data":"c52f2d43fb9903c16c8a07a233ef361351640f416526ffe821ea988c79586516"} Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.939540 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.943123 4757 generic.go:334] "Generic (PLEG): container finished" podID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerID="89d66b8088206c9ce386d6c0b3b29c613c82d13e1673d80000c75c1336a20ceb" exitCode=0 Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.943148 4757 generic.go:334] "Generic (PLEG): container finished" podID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerID="74c6d1460b7105d99b13b5487f82664d050b44bab32e6ec47dd7f4895078d96a" exitCode=2 Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.943169 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerDied","Data":"89d66b8088206c9ce386d6c0b3b29c613c82d13e1673d80000c75c1336a20ceb"} Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.943198 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerDied","Data":"74c6d1460b7105d99b13b5487f82664d050b44bab32e6ec47dd7f4895078d96a"} Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.967321 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.475224384 podStartE2EDuration="2.967300358s" podCreationTimestamp="2025-12-16 13:11:00 +0000 UTC" firstStartedPulling="2025-12-16 13:11:02.036728008 +0000 UTC m=+1447.464471804" lastFinishedPulling="2025-12-16 13:11:02.528803982 +0000 UTC m=+1447.956547778" observedRunningTime="2025-12-16 13:11:02.962662033 +0000 UTC m=+1448.390405829" watchObservedRunningTime="2025-12-16 13:11:02.967300358 +0000 UTC m=+1448.395044154" Dec 16 13:11:02 crc kubenswrapper[4757]: I1216 13:11:02.994597 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.007078 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.022376 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:03 crc kubenswrapper[4757]: E1216 13:11:03.022820 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-metadata" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.022840 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-metadata" Dec 16 13:11:03 crc kubenswrapper[4757]: E1216 13:11:03.022865 4757 cpu_manager.go:410] "RemoveStaleState: removing container" 
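The exit codes in these "container finished" entries follow the usual 128+signal convention: exitCode=0 is a clean shutdown, exitCode=2 is the process's own error status, and the exitCode=143 seen earlier for nova-metadata-log is 128 + 15, i.e. the container died to the SIGTERM sent by the graceful kill. A tiny decoder, as a sketch:

    package main

    import (
        "fmt"
        "syscall"
    )

    // Exit codes above 128 encode 128 + signal number, so 143 maps to
    // SIGTERM (15) from the grace-period kill logged above.
    func main() {
        for _, code := range []int{0, 2, 143} {
            if code > 128 {
                fmt.Printf("%d => killed by signal %v\n", code, syscall.Signal(code-128))
                continue
            }
            fmt.Printf("%d => exited on its own\n", code)
        }
    }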
podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-log" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.022871 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-log" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.023043 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-log" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.023064 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" containerName="nova-metadata-metadata" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.024191 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.028227 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.028490 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.033952 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.072543 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.072622 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-config-data\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.072672 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.072760 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxn7s\" (UniqueName: \"kubernetes.io/projected/34dae10c-f09b-433d-8fb4-b218bddb1fb4-kube-api-access-xxn7s\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.072811 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34dae10c-f09b-433d-8fb4-b218bddb1fb4-logs\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.174213 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34dae10c-f09b-433d-8fb4-b218bddb1fb4-logs\") pod \"nova-metadata-0\" (UID: 
\"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.174307 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.174333 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-config-data\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.174371 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.174439 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxn7s\" (UniqueName: \"kubernetes.io/projected/34dae10c-f09b-433d-8fb4-b218bddb1fb4-kube-api-access-xxn7s\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.175684 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34dae10c-f09b-433d-8fb4-b218bddb1fb4-logs\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.179492 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-config-data\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.183810 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.188095 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.197136 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxn7s\" (UniqueName: \"kubernetes.io/projected/34dae10c-f09b-433d-8fb4-b218bddb1fb4-kube-api-access-xxn7s\") pod \"nova-metadata-0\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.345260 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.870134 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.959599 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34dae10c-f09b-433d-8fb4-b218bddb1fb4","Type":"ContainerStarted","Data":"7d975329145fced263f1887d187296ac0f1f4803093af5ecb09230d45b05a73d"} Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.963354 4757 generic.go:334] "Generic (PLEG): container finished" podID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerID="efb89a168722bcb14295498f72c5a979cfc30cef53986f772a8d062e33ece741" exitCode=0 Dec 16 13:11:03 crc kubenswrapper[4757]: I1216 13:11:03.963434 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerDied","Data":"efb89a168722bcb14295498f72c5a979cfc30cef53986f772a8d062e33ece741"} Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.079222 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.079486 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.436861 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.445362 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.446617 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.501410 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.800185 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.901333 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-drpwc"] Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.901939 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerName="dnsmasq-dns" containerID="cri-o://94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31" gracePeriod=10 Dec 16 13:11:04 crc kubenswrapper[4757]: I1216 13:11:04.972761 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3b1969-e4b8-425a-af4a-a2ec019ff7e6" path="/var/lib/kubelet/pods/fc3b1969-e4b8-425a-af4a-a2ec019ff7e6/volumes" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.005834 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34dae10c-f09b-433d-8fb4-b218bddb1fb4","Type":"ContainerStarted","Data":"eab31abc53465cb643d676123d942f87bcad36c8f1982932232dd68736d45edf"} Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.005891 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"34dae10c-f09b-433d-8fb4-b218bddb1fb4","Type":"ContainerStarted","Data":"d33e064fe84afbe39763af02540fa8b80b5c94333c03df3c8cfd4f52ca2d1d86"} Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.071985 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.071948103 podStartE2EDuration="3.071948103s" podCreationTimestamp="2025-12-16 13:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:05.044729998 +0000 UTC m=+1450.472473814" watchObservedRunningTime="2025-12-16 13:11:05.071948103 +0000 UTC m=+1450.499691899" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.089156 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.170698 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.170853 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.571754 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.758217 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-sb\") pod \"5c88d603-6fdd-446a-a46c-990d30bacb6c\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.758598 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8xfj\" (UniqueName: \"kubernetes.io/projected/5c88d603-6fdd-446a-a46c-990d30bacb6c-kube-api-access-k8xfj\") pod \"5c88d603-6fdd-446a-a46c-990d30bacb6c\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.758695 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-nb\") pod \"5c88d603-6fdd-446a-a46c-990d30bacb6c\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.758738 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-swift-storage-0\") pod \"5c88d603-6fdd-446a-a46c-990d30bacb6c\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.758773 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-svc\") pod \"5c88d603-6fdd-446a-a46c-990d30bacb6c\" (UID: 
\"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.758806 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-config\") pod \"5c88d603-6fdd-446a-a46c-990d30bacb6c\" (UID: \"5c88d603-6fdd-446a-a46c-990d30bacb6c\") " Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.787445 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c88d603-6fdd-446a-a46c-990d30bacb6c-kube-api-access-k8xfj" (OuterVolumeSpecName: "kube-api-access-k8xfj") pod "5c88d603-6fdd-446a-a46c-990d30bacb6c" (UID: "5c88d603-6fdd-446a-a46c-990d30bacb6c"). InnerVolumeSpecName "kube-api-access-k8xfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.861466 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8xfj\" (UniqueName: \"kubernetes.io/projected/5c88d603-6fdd-446a-a46c-990d30bacb6c-kube-api-access-k8xfj\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.863524 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c88d603-6fdd-446a-a46c-990d30bacb6c" (UID: "5c88d603-6fdd-446a-a46c-990d30bacb6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.868444 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c88d603-6fdd-446a-a46c-990d30bacb6c" (UID: "5c88d603-6fdd-446a-a46c-990d30bacb6c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.884269 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5c88d603-6fdd-446a-a46c-990d30bacb6c" (UID: "5c88d603-6fdd-446a-a46c-990d30bacb6c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.926322 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-config" (OuterVolumeSpecName: "config") pod "5c88d603-6fdd-446a-a46c-990d30bacb6c" (UID: "5c88d603-6fdd-446a-a46c-990d30bacb6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.944434 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c88d603-6fdd-446a-a46c-990d30bacb6c" (UID: "5c88d603-6fdd-446a-a46c-990d30bacb6c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.965152 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.965442 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.965529 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.965606 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:05 crc kubenswrapper[4757]: I1216 13:11:05.965676 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c88d603-6fdd-446a-a46c-990d30bacb6c-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.025460 4757 generic.go:334] "Generic (PLEG): container finished" podID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerID="94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31" exitCode=0 Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.025945 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" event={"ID":"5c88d603-6fdd-446a-a46c-990d30bacb6c","Type":"ContainerDied","Data":"94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31"} Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.026113 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" event={"ID":"5c88d603-6fdd-446a-a46c-990d30bacb6c","Type":"ContainerDied","Data":"3195e0b4375087b59f05398ee7b152c8b6b8cc610e1641a9bde9cb141e378bc4"} Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.026219 4757 scope.go:117] "RemoveContainer" containerID="94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.026461 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-drpwc" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.090556 4757 scope.go:117] "RemoveContainer" containerID="f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.099999 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-drpwc"] Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.157776 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-drpwc"] Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.212262 4757 scope.go:117] "RemoveContainer" containerID="94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31" Dec 16 13:11:06 crc kubenswrapper[4757]: E1216 13:11:06.213146 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31\": container with ID starting with 94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31 not found: ID does not exist" containerID="94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.213182 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31"} err="failed to get container status \"94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31\": rpc error: code = NotFound desc = could not find container \"94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31\": container with ID starting with 94e6c3bc2979f5662ca293d51a8a7ca0bf30fe102fe76030e47ac6f7da4b0a31 not found: ID does not exist" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.213206 4757 scope.go:117] "RemoveContainer" containerID="f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67" Dec 16 13:11:06 crc kubenswrapper[4757]: E1216 13:11:06.215925 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67\": container with ID starting with f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67 not found: ID does not exist" containerID="f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.215964 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67"} err="failed to get container status \"f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67\": rpc error: code = NotFound desc = could not find container \"f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67\": container with ID starting with f85f01c36d09719740a0b3caae5f88bff12825cabc5202009aa6e275af7d2e67 not found: ID does not exist" Dec 16 13:11:06 crc kubenswrapper[4757]: I1216 13:11:06.971847 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" path="/var/lib/kubelet/pods/5c88d603-6fdd-446a-a46c-990d30bacb6c/volumes" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.060603 4757 generic.go:334] "Generic (PLEG): container finished" podID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerID="918217cd1d1408090d4aa8975a6c182e8dad824497dcdbc2f1a90f19e2da4c26" 
exitCode=0 Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.060664 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerDied","Data":"918217cd1d1408090d4aa8975a6c182e8dad824497dcdbc2f1a90f19e2da4c26"} Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.064806 4757 generic.go:334] "Generic (PLEG): container finished" podID="399f2693-64b1-4958-ad75-49c45b448ed5" containerID="27930a46a589b4d3638de24e488861d0ce2d79305f0f2fcc652982d005f8b8df" exitCode=137 Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.065073 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerDied","Data":"27930a46a589b4d3638de24e488861d0ce2d79305f0f2fcc652982d005f8b8df"} Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.615139 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.625040 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.711861 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-scripts\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.711934 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7g7c\" (UniqueName: \"kubernetes.io/projected/02eab243-e69e-4823-bc6f-2e6b70d5c80d-kube-api-access-c7g7c\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.712071 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-sg-core-conf-yaml\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.712198 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-log-httpd\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.712227 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-run-httpd\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.712281 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-combined-ca-bundle\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.712357 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-config-data\") pod \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\" (UID: \"02eab243-e69e-4823-bc6f-2e6b70d5c80d\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.715143 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.716604 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.760219 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-scripts" (OuterVolumeSpecName: "scripts") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.760582 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02eab243-e69e-4823-bc6f-2e6b70d5c80d-kube-api-access-c7g7c" (OuterVolumeSpecName: "kube-api-access-c7g7c") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "kube-api-access-c7g7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.776289 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814498 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-secret-key\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814658 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-combined-ca-bundle\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814699 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f2693-64b1-4958-ad75-49c45b448ed5-logs\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814733 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-scripts\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814790 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-tls-certs\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814822 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-config-data\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.814932 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqnx\" (UniqueName: \"kubernetes.io/projected/399f2693-64b1-4958-ad75-49c45b448ed5-kube-api-access-2dqnx\") pod \"399f2693-64b1-4958-ad75-49c45b448ed5\" (UID: \"399f2693-64b1-4958-ad75-49c45b448ed5\") " Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815153 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399f2693-64b1-4958-ad75-49c45b448ed5-logs" (OuterVolumeSpecName: "logs") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815472 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815492 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02eab243-e69e-4823-bc6f-2e6b70d5c80d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815503 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399f2693-64b1-4958-ad75-49c45b448ed5-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815514 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815525 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7g7c\" (UniqueName: \"kubernetes.io/projected/02eab243-e69e-4823-bc6f-2e6b70d5c80d-kube-api-access-c7g7c\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.815536 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.833295 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399f2693-64b1-4958-ad75-49c45b448ed5-kube-api-access-2dqnx" (OuterVolumeSpecName: "kube-api-access-2dqnx") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "kube-api-access-2dqnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.859529 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.878808 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-config-data" (OuterVolumeSpecName: "config-data") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.900282 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.917431 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqnx\" (UniqueName: \"kubernetes.io/projected/399f2693-64b1-4958-ad75-49c45b448ed5-kube-api-access-2dqnx\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.919250 4757 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.919385 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.919476 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.921502 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-scripts" (OuterVolumeSpecName: "scripts") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.931870 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.941480 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "399f2693-64b1-4958-ad75-49c45b448ed5" (UID: "399f2693-64b1-4958-ad75-49c45b448ed5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:07 crc kubenswrapper[4757]: I1216 13:11:07.963548 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-config-data" (OuterVolumeSpecName: "config-data") pod "02eab243-e69e-4823-bc6f-2e6b70d5c80d" (UID: "02eab243-e69e-4823-bc6f-2e6b70d5c80d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.022027 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.022711 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/399f2693-64b1-4958-ad75-49c45b448ed5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.022850 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02eab243-e69e-4823-bc6f-2e6b70d5c80d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.022980 4757 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/399f2693-64b1-4958-ad75-49c45b448ed5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.075315 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75ccc7d896-jmrk9" event={"ID":"399f2693-64b1-4958-ad75-49c45b448ed5","Type":"ContainerDied","Data":"9a841203e84fb18d5fece40a98bea67abcafaf0ddc9e93031e6806f863f1c521"} Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.075588 4757 scope.go:117] "RemoveContainer" containerID="c796a6497876b25321bec8486b437aa7c77c35a00f83ae805bc05496ed54f2cc" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.075685 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75ccc7d896-jmrk9" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.078922 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02eab243-e69e-4823-bc6f-2e6b70d5c80d","Type":"ContainerDied","Data":"b76bcae88aac804c6bfb5024d5cbcb4f0660c13b087862a535e6a4ed551057e4"} Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.079185 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.112200 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75ccc7d896-jmrk9"] Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.124380 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75ccc7d896-jmrk9"] Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.141776 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.154468 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.167955 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.168560 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-notification-agent" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.168640 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-notification-agent" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.168697 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-central-agent" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.168747 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-central-agent" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.168804 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="proxy-httpd" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.168855 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="proxy-httpd" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.168918 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon-log" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.168985 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon-log" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.169085 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerName="init" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.169146 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerName="init" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.169232 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.169313 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.169397 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="sg-core" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.169475 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" 
containerName="sg-core" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.169542 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerName="dnsmasq-dns" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.169594 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerName="dnsmasq-dns" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.169646 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.169702 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: E1216 13:11:08.169827 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.169894 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170236 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="proxy-httpd" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170311 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-notification-agent" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170475 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="ceilometer-central-agent" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170594 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon-log" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170659 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170722 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170784 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c88d603-6fdd-446a-a46c-990d30bacb6c" containerName="dnsmasq-dns" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170839 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" containerName="horizon" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.170898 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" containerName="sg-core" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.172652 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.185946 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.186791 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.188169 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.188755 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.327933 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmpn\" (UniqueName: \"kubernetes.io/projected/d1136893-32fb-41f6-97cb-466c3819677e-kube-api-access-xpmpn\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328242 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-config-data\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328332 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-run-httpd\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328411 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-scripts\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328483 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328588 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328696 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-log-httpd\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.328814 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.345539 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.345633 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431147 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-log-httpd\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431259 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431410 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmpn\" (UniqueName: \"kubernetes.io/projected/d1136893-32fb-41f6-97cb-466c3819677e-kube-api-access-xpmpn\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431454 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-config-data\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431484 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-run-httpd\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431505 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-scripts\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431529 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431571 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.431771 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-log-httpd\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.432123 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-run-httpd\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.436367 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-scripts\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.436803 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.437901 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.438107 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.463895 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-config-data\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.488288 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmpn\" (UniqueName: \"kubernetes.io/projected/d1136893-32fb-41f6-97cb-466c3819677e-kube-api-access-xpmpn\") pod \"ceilometer-0\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.503452 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.537521 4757 scope.go:117] "RemoveContainer" containerID="27930a46a589b4d3638de24e488861d0ce2d79305f0f2fcc652982d005f8b8df" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.577391 4757 scope.go:117] "RemoveContainer" containerID="89d66b8088206c9ce386d6c0b3b29c613c82d13e1673d80000c75c1336a20ceb" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.732739 4757 scope.go:117] "RemoveContainer" containerID="74c6d1460b7105d99b13b5487f82664d050b44bab32e6ec47dd7f4895078d96a" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.779620 4757 scope.go:117] "RemoveContainer" containerID="918217cd1d1408090d4aa8975a6c182e8dad824497dcdbc2f1a90f19e2da4c26" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.843019 4757 scope.go:117] "RemoveContainer" containerID="efb89a168722bcb14295498f72c5a979cfc30cef53986f772a8d062e33ece741" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.960181 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02eab243-e69e-4823-bc6f-2e6b70d5c80d" path="/var/lib/kubelet/pods/02eab243-e69e-4823-bc6f-2e6b70d5c80d/volumes" Dec 16 13:11:08 crc kubenswrapper[4757]: I1216 13:11:08.961166 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399f2693-64b1-4958-ad75-49c45b448ed5" path="/var/lib/kubelet/pods/399f2693-64b1-4958-ad75-49c45b448ed5/volumes" Dec 16 13:11:09 crc kubenswrapper[4757]: I1216 13:11:09.104276 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:10 crc kubenswrapper[4757]: I1216 13:11:10.165997 4757 generic.go:334] "Generic (PLEG): container finished" podID="32ea5e67-160d-47fd-9bb3-70141a4bcdb1" containerID="bb5bac903cac19e8af0a42238e02f346db20cfc7fc52e6187c10601921713636" exitCode=0 Dec 16 13:11:10 crc kubenswrapper[4757]: I1216 13:11:10.166227 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b4m4q" event={"ID":"32ea5e67-160d-47fd-9bb3-70141a4bcdb1","Type":"ContainerDied","Data":"bb5bac903cac19e8af0a42238e02f346db20cfc7fc52e6187c10601921713636"} Dec 16 13:11:10 crc kubenswrapper[4757]: I1216 13:11:10.172078 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerStarted","Data":"7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036"} Dec 16 13:11:10 crc kubenswrapper[4757]: I1216 13:11:10.172329 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerStarted","Data":"37b7f49bbbdc53559c7b19796c3bf27008270cf1cef18fed9bab2af69e02a064"} Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.182374 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerStarted","Data":"5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c"} Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.185703 4757 generic.go:334] "Generic (PLEG): container finished" podID="1c6b1bba-b68a-4912-aada-0229a7152426" containerID="3de9e060dc0c56b825fb723b67ad28f9320965646c9a8d7934c870822718e4e7" exitCode=0 Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.185784 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwttf" 
event={"ID":"1c6b1bba-b68a-4912-aada-0229a7152426","Type":"ContainerDied","Data":"3de9e060dc0c56b825fb723b67ad28f9320965646c9a8d7934c870822718e4e7"} Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.431145 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.576139 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b4m4q" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.713061 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-config-data\") pod \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.713147 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-combined-ca-bundle\") pod \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.713196 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnptg\" (UniqueName: \"kubernetes.io/projected/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-kube-api-access-wnptg\") pod \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.713390 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-scripts\") pod \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\" (UID: \"32ea5e67-160d-47fd-9bb3-70141a4bcdb1\") " Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.726614 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-kube-api-access-wnptg" (OuterVolumeSpecName: "kube-api-access-wnptg") pod "32ea5e67-160d-47fd-9bb3-70141a4bcdb1" (UID: "32ea5e67-160d-47fd-9bb3-70141a4bcdb1"). InnerVolumeSpecName "kube-api-access-wnptg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.727188 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-scripts" (OuterVolumeSpecName: "scripts") pod "32ea5e67-160d-47fd-9bb3-70141a4bcdb1" (UID: "32ea5e67-160d-47fd-9bb3-70141a4bcdb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.744612 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-config-data" (OuterVolumeSpecName: "config-data") pod "32ea5e67-160d-47fd-9bb3-70141a4bcdb1" (UID: "32ea5e67-160d-47fd-9bb3-70141a4bcdb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.747093 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32ea5e67-160d-47fd-9bb3-70141a4bcdb1" (UID: "32ea5e67-160d-47fd-9bb3-70141a4bcdb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.815221 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.815256 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.815267 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnptg\" (UniqueName: \"kubernetes.io/projected/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-kube-api-access-wnptg\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:11 crc kubenswrapper[4757]: I1216 13:11:11.815276 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ea5e67-160d-47fd-9bb3-70141a4bcdb1-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.197034 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b4m4q" event={"ID":"32ea5e67-160d-47fd-9bb3-70141a4bcdb1","Type":"ContainerDied","Data":"2c9b87277e05d98ef77a3269deea28f8da5d7549c900889243f4a1f6833d27c9"} Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.197074 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9b87277e05d98ef77a3269deea28f8da5d7549c900889243f4a1f6833d27c9" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.197085 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b4m4q" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.212979 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerStarted","Data":"a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5"} Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.364901 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.365426 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-log" containerID="cri-o://ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789" gracePeriod=30 Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.365827 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-api" containerID="cri-o://b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32" gracePeriod=30 Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.392017 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.392274 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c8a8bd03-898d-43fb-84d1-700e8fff5a26" containerName="nova-scheduler-scheduler" containerID="cri-o://0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b" gracePeriod=30 Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.460662 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.460861 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-log" containerID="cri-o://d33e064fe84afbe39763af02540fa8b80b5c94333c03df3c8cfd4f52ca2d1d86" gracePeriod=30 Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.461282 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-metadata" containerID="cri-o://eab31abc53465cb643d676123d942f87bcad36c8f1982932232dd68736d45edf" gracePeriod=30 Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.647874 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwttf" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.736184 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-config-data\") pod \"1c6b1bba-b68a-4912-aada-0229a7152426\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.736428 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-combined-ca-bundle\") pod \"1c6b1bba-b68a-4912-aada-0229a7152426\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.741244 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgzdg\" (UniqueName: \"kubernetes.io/projected/1c6b1bba-b68a-4912-aada-0229a7152426-kube-api-access-vgzdg\") pod \"1c6b1bba-b68a-4912-aada-0229a7152426\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.742366 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-scripts\") pod \"1c6b1bba-b68a-4912-aada-0229a7152426\" (UID: \"1c6b1bba-b68a-4912-aada-0229a7152426\") " Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.760503 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6b1bba-b68a-4912-aada-0229a7152426-kube-api-access-vgzdg" (OuterVolumeSpecName: "kube-api-access-vgzdg") pod "1c6b1bba-b68a-4912-aada-0229a7152426" (UID: "1c6b1bba-b68a-4912-aada-0229a7152426"). InnerVolumeSpecName "kube-api-access-vgzdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.773149 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-scripts" (OuterVolumeSpecName: "scripts") pod "1c6b1bba-b68a-4912-aada-0229a7152426" (UID: "1c6b1bba-b68a-4912-aada-0229a7152426"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.809082 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c6b1bba-b68a-4912-aada-0229a7152426" (UID: "1c6b1bba-b68a-4912-aada-0229a7152426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.813450 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-config-data" (OuterVolumeSpecName: "config-data") pod "1c6b1bba-b68a-4912-aada-0229a7152426" (UID: "1c6b1bba-b68a-4912-aada-0229a7152426"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.847800 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.847834 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.847844 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgzdg\" (UniqueName: \"kubernetes.io/projected/1c6b1bba-b68a-4912-aada-0229a7152426-kube-api-access-vgzdg\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:12 crc kubenswrapper[4757]: I1216 13:11:12.847852 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c6b1bba-b68a-4912-aada-0229a7152426-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.229128 4757 generic.go:334] "Generic (PLEG): container finished" podID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerID="eab31abc53465cb643d676123d942f87bcad36c8f1982932232dd68736d45edf" exitCode=0 Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.229165 4757 generic.go:334] "Generic (PLEG): container finished" podID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerID="d33e064fe84afbe39763af02540fa8b80b5c94333c03df3c8cfd4f52ca2d1d86" exitCode=143 Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.229215 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34dae10c-f09b-433d-8fb4-b218bddb1fb4","Type":"ContainerDied","Data":"eab31abc53465cb643d676123d942f87bcad36c8f1982932232dd68736d45edf"} Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.229243 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34dae10c-f09b-433d-8fb4-b218bddb1fb4","Type":"ContainerDied","Data":"d33e064fe84afbe39763af02540fa8b80b5c94333c03df3c8cfd4f52ca2d1d86"} Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.235473 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.256033 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hwttf" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.257026 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hwttf" event={"ID":"1c6b1bba-b68a-4912-aada-0229a7152426","Type":"ContainerDied","Data":"d7d9a78689b10e989ad0550955006abcbf461e038ee689ddeebdba8b1fc12418"} Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.257073 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7d9a78689b10e989ad0550955006abcbf461e038ee689ddeebdba8b1fc12418" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.285637 4757 generic.go:334] "Generic (PLEG): container finished" podID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerID="ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789" exitCode=143 Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.285711 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d35ab5a-645f-40c9-ba7a-288a5ed7722a","Type":"ContainerDied","Data":"ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789"} Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.292885 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerStarted","Data":"733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0"} Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.293791 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.356327 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34dae10c-f09b-433d-8fb4-b218bddb1fb4-logs\") pod \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.356429 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-nova-metadata-tls-certs\") pod \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.356539 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-combined-ca-bundle\") pod \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.356579 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-config-data\") pod \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.356640 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxn7s\" (UniqueName: \"kubernetes.io/projected/34dae10c-f09b-433d-8fb4-b218bddb1fb4-kube-api-access-xxn7s\") pod \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\" (UID: \"34dae10c-f09b-433d-8fb4-b218bddb1fb4\") " Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.358619 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/34dae10c-f09b-433d-8fb4-b218bddb1fb4-logs" (OuterVolumeSpecName: "logs") pod "34dae10c-f09b-433d-8fb4-b218bddb1fb4" (UID: "34dae10c-f09b-433d-8fb4-b218bddb1fb4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.364926 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.839503118 podStartE2EDuration="5.36490479s" podCreationTimestamp="2025-12-16 13:11:08 +0000 UTC" firstStartedPulling="2025-12-16 13:11:09.131369699 +0000 UTC m=+1454.559113495" lastFinishedPulling="2025-12-16 13:11:12.656771371 +0000 UTC m=+1458.084515167" observedRunningTime="2025-12-16 13:11:13.33384752 +0000 UTC m=+1458.761591326" watchObservedRunningTime="2025-12-16 13:11:13.36490479 +0000 UTC m=+1458.792648586" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.366855 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34dae10c-f09b-433d-8fb4-b218bddb1fb4-kube-api-access-xxn7s" (OuterVolumeSpecName: "kube-api-access-xxn7s") pod "34dae10c-f09b-433d-8fb4-b218bddb1fb4" (UID: "34dae10c-f09b-433d-8fb4-b218bddb1fb4"). InnerVolumeSpecName "kube-api-access-xxn7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.370344 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 13:11:13 crc kubenswrapper[4757]: E1216 13:11:13.371214 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-log" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371232 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-log" Dec 16 13:11:13 crc kubenswrapper[4757]: E1216 13:11:13.371271 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6b1bba-b68a-4912-aada-0229a7152426" containerName="nova-cell1-conductor-db-sync" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371294 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b1bba-b68a-4912-aada-0229a7152426" containerName="nova-cell1-conductor-db-sync" Dec 16 13:11:13 crc kubenswrapper[4757]: E1216 13:11:13.371304 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ea5e67-160d-47fd-9bb3-70141a4bcdb1" containerName="nova-manage" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371310 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ea5e67-160d-47fd-9bb3-70141a4bcdb1" containerName="nova-manage" Dec 16 13:11:13 crc kubenswrapper[4757]: E1216 13:11:13.371322 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-metadata" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371329 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-metadata" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371537 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-log" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371550 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" containerName="nova-metadata-metadata" Dec 16 13:11:13 
crc kubenswrapper[4757]: I1216 13:11:13.371559 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ea5e67-160d-47fd-9bb3-70141a4bcdb1" containerName="nova-manage" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.371579 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6b1bba-b68a-4912-aada-0229a7152426" containerName="nova-cell1-conductor-db-sync" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.372387 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.377484 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.417504 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34dae10c-f09b-433d-8fb4-b218bddb1fb4" (UID: "34dae10c-f09b-433d-8fb4-b218bddb1fb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.424035 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.449643 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-config-data" (OuterVolumeSpecName: "config-data") pod "34dae10c-f09b-433d-8fb4-b218bddb1fb4" (UID: "34dae10c-f09b-433d-8fb4-b218bddb1fb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.459572 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276gc\" (UniqueName: \"kubernetes.io/projected/f7f0c537-530b-4e4a-ae96-35ba695d26be-kube-api-access-276gc\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.459777 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f0c537-530b-4e4a-ae96-35ba695d26be-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.459823 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0c537-530b-4e4a-ae96-35ba695d26be-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.459961 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34dae10c-f09b-433d-8fb4-b218bddb1fb4-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.459979 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 
13:11:13.459993 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.460023 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxn7s\" (UniqueName: \"kubernetes.io/projected/34dae10c-f09b-433d-8fb4-b218bddb1fb4-kube-api-access-xxn7s\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.489195 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "34dae10c-f09b-433d-8fb4-b218bddb1fb4" (UID: "34dae10c-f09b-433d-8fb4-b218bddb1fb4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.562330 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f0c537-530b-4e4a-ae96-35ba695d26be-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.562404 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0c537-530b-4e4a-ae96-35ba695d26be-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.562492 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276gc\" (UniqueName: \"kubernetes.io/projected/f7f0c537-530b-4e4a-ae96-35ba695d26be-kube-api-access-276gc\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.562622 4757 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34dae10c-f09b-433d-8fb4-b218bddb1fb4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.567470 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f0c537-530b-4e4a-ae96-35ba695d26be-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.567534 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f0c537-530b-4e4a-ae96-35ba695d26be-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 13:11:13.578201 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276gc\" (UniqueName: \"kubernetes.io/projected/f7f0c537-530b-4e4a-ae96-35ba695d26be-kube-api-access-276gc\") pod \"nova-cell1-conductor-0\" (UID: \"f7f0c537-530b-4e4a-ae96-35ba695d26be\") " pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:13 crc kubenswrapper[4757]: I1216 
13:11:13.732585 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.265042 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.305095 4757 generic.go:334] "Generic (PLEG): container finished" podID="c8a8bd03-898d-43fb-84d1-700e8fff5a26" containerID="0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b" exitCode=0 Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.305163 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8a8bd03-898d-43fb-84d1-700e8fff5a26","Type":"ContainerDied","Data":"0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b"} Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.305191 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8a8bd03-898d-43fb-84d1-700e8fff5a26","Type":"ContainerDied","Data":"95a87bd64251adcb47e73fd7b2b3dd944d27b91b89dcc2fb2038e8f7b93884bf"} Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.305209 4757 scope.go:117] "RemoveContainer" containerID="0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.305346 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.314012 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34dae10c-f09b-433d-8fb4-b218bddb1fb4","Type":"ContainerDied","Data":"7d975329145fced263f1887d187296ac0f1f4803093af5ecb09230d45b05a73d"} Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.314063 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.336534 4757 scope.go:117] "RemoveContainer" containerID="0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b" Dec 16 13:11:14 crc kubenswrapper[4757]: E1216 13:11:14.337189 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b\": container with ID starting with 0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b not found: ID does not exist" containerID="0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.337224 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b"} err="failed to get container status \"0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b\": rpc error: code = NotFound desc = could not find container \"0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b\": container with ID starting with 0c22707850366730ebe75dd0cfe5925f7c956dfb61016d4015be571d36fae94b not found: ID does not exist" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.337269 4757 scope.go:117] "RemoveContainer" containerID="eab31abc53465cb643d676123d942f87bcad36c8f1982932232dd68736d45edf" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.395348 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.397325 4757 scope.go:117] "RemoveContainer" containerID="d33e064fe84afbe39763af02540fa8b80b5c94333c03df3c8cfd4f52ca2d1d86" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.412354 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-config-data\") pod \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.412437 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-combined-ca-bundle\") pod \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.412556 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gfvv\" (UniqueName: \"kubernetes.io/projected/c8a8bd03-898d-43fb-84d1-700e8fff5a26-kube-api-access-4gfvv\") pod \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\" (UID: \"c8a8bd03-898d-43fb-84d1-700e8fff5a26\") " Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.412712 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.426077 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: E1216 13:11:14.426517 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a8bd03-898d-43fb-84d1-700e8fff5a26" containerName="nova-scheduler-scheduler" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.426533 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a8bd03-898d-43fb-84d1-700e8fff5a26" 
containerName="nova-scheduler-scheduler" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.426716 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a8bd03-898d-43fb-84d1-700e8fff5a26" containerName="nova-scheduler-scheduler" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.428153 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.432277 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.436304 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.439751 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a8bd03-898d-43fb-84d1-700e8fff5a26-kube-api-access-4gfvv" (OuterVolumeSpecName: "kube-api-access-4gfvv") pod "c8a8bd03-898d-43fb-84d1-700e8fff5a26" (UID: "c8a8bd03-898d-43fb-84d1-700e8fff5a26"). InnerVolumeSpecName "kube-api-access-4gfvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.446497 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.512431 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.515980 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.516024 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-config-data\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.516288 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.516358 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fm8s\" (UniqueName: \"kubernetes.io/projected/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-kube-api-access-4fm8s\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.516615 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-logs\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.516693 4757 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-4gfvv\" (UniqueName: \"kubernetes.io/projected/c8a8bd03-898d-43fb-84d1-700e8fff5a26-kube-api-access-4gfvv\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.539058 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8a8bd03-898d-43fb-84d1-700e8fff5a26" (UID: "c8a8bd03-898d-43fb-84d1-700e8fff5a26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.548214 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-config-data" (OuterVolumeSpecName: "config-data") pod "c8a8bd03-898d-43fb-84d1-700e8fff5a26" (UID: "c8a8bd03-898d-43fb-84d1-700e8fff5a26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.619702 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-logs\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.619810 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.619835 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-config-data\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.619877 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.619991 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fm8s\" (UniqueName: \"kubernetes.io/projected/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-kube-api-access-4fm8s\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.620075 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.620087 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a8bd03-898d-43fb-84d1-700e8fff5a26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.621057 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-logs\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.629782 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.630129 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.631950 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-config-data\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.637758 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fm8s\" (UniqueName: \"kubernetes.io/projected/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-kube-api-access-4fm8s\") pod \"nova-metadata-0\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.782461 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.795757 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.813231 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.814710 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.815480 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.817627 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.828825 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.829526 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-config-data\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.829673 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4n5m\" (UniqueName: \"kubernetes.io/projected/edc9f1f6-2c76-4a89-bcfc-24d45502027a-kube-api-access-s4n5m\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.834472 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.932339 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4n5m\" (UniqueName: \"kubernetes.io/projected/edc9f1f6-2c76-4a89-bcfc-24d45502027a-kube-api-access-s4n5m\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.933049 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.933341 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-config-data\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.939980 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-config-data\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.976586 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34dae10c-f09b-433d-8fb4-b218bddb1fb4" path="/var/lib/kubelet/pods/34dae10c-f09b-433d-8fb4-b218bddb1fb4/volumes" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.977468 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a8bd03-898d-43fb-84d1-700e8fff5a26" path="/var/lib/kubelet/pods/c8a8bd03-898d-43fb-84d1-700e8fff5a26/volumes" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 
13:11:14.984131 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4n5m\" (UniqueName: \"kubernetes.io/projected/edc9f1f6-2c76-4a89-bcfc-24d45502027a-kube-api-access-s4n5m\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:14 crc kubenswrapper[4757]: I1216 13:11:14.984949 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.203516 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.346186 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7f0c537-530b-4e4a-ae96-35ba695d26be","Type":"ContainerStarted","Data":"acfdd3af49416006fcba96ce8cae880a02fa5d7b77bdbc226c206149240eb299"} Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.346229 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7f0c537-530b-4e4a-ae96-35ba695d26be","Type":"ContainerStarted","Data":"d3371190b7751433f247af8ab4db92966e4106c1490a820d3a2af8b875bca809"} Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.347631 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.389125 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:15 crc kubenswrapper[4757]: W1216 13:11:15.394217 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34ac3c8a_f8cf_404e_93ed_8c422df4bebf.slice/crio-60ab2d04e2feb9604c7c4db879aaeae2a4fbabaea63fa95b19c72985439b207b WatchSource:0}: Error finding container 60ab2d04e2feb9604c7c4db879aaeae2a4fbabaea63fa95b19c72985439b207b: Status 404 returned error can't find the container with id 60ab2d04e2feb9604c7c4db879aaeae2a4fbabaea63fa95b19c72985439b207b Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.401415 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.401388545 podStartE2EDuration="2.401388545s" podCreationTimestamp="2025-12-16 13:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:15.372537641 +0000 UTC m=+1460.800281447" watchObservedRunningTime="2025-12-16 13:11:15.401388545 +0000 UTC m=+1460.829132341" Dec 16 13:11:15 crc kubenswrapper[4757]: W1216 13:11:15.796934 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedc9f1f6_2c76_4a89_bcfc_24d45502027a.slice/crio-39310a2e6c37b72fe30d7feca6d857303905aa41a11eb9f51d06c6bf4aea4049 WatchSource:0}: Error finding container 39310a2e6c37b72fe30d7feca6d857303905aa41a11eb9f51d06c6bf4aea4049: Status 404 returned error can't find the container with id 39310a2e6c37b72fe30d7feca6d857303905aa41a11eb9f51d06c6bf4aea4049
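
Analysis note: in the "Observed pod startup duration" entry above, podStartE2EDuration runs from pod creation to the first observed running state, while podStartSLOduration excludes image-pull time. For nova-cell1-conductor-0 no pull happened (both pulling timestamps are the zero value 0001-01-01), so the two are equal at 2.40s; for ceilometer-0 a few entries earlier, SLO 1.839503118s is exactly E2E 5.36490479s minus the firstStartedPulling-to-lastFinishedPulling window. The surrounding W1216 manager.go:1169 "Status 404" warnings appear to be cAdvisor's cgroup watcher racing the just-created containers; no pod failure follows them in this capture. The arithmetic check (plain Python; timestamps truncated to microseconds since strptime has no nanosecond support):

    # Verify: podStartSLOduration == podStartE2EDuration - image pull window,
    # using the ceilometer-0 values logged at 13:11:13.364926 above.
    from datetime import datetime

    fmt = "%Y-%m-%d %H:%M:%S.%f %z"
    first_pull = datetime.strptime("2025-12-16 13:11:09.131369 +0000", fmt)
    last_pull = datetime.strptime("2025-12-16 13:11:12.656771 +0000", fmt)

    pull = (last_pull - first_pull).total_seconds()
    print(f"pull window: {pull:.6f}s")             # ~3.525402s
    print(f"SLO + pull:  {1.839503 + pull:.6f}s")  # ~5.364905s, matching E2E

Dec 16 13:11:15 crc kubenswrapper[4757]: I1216 13:11:15.808597 4757 kubelet.go:2428] "SyncLoop UPDATE"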
source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.331255 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.417159 4757 generic.go:334] "Generic (PLEG): container finished" podID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerID="b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32" exitCode=0 Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.417224 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d35ab5a-645f-40c9-ba7a-288a5ed7722a","Type":"ContainerDied","Data":"b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.417251 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8d35ab5a-645f-40c9-ba7a-288a5ed7722a","Type":"ContainerDied","Data":"d6300d7b8ff2ed3764443c70d28e1065694c49b3a805bdc3f49c3a84b536ace9"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.417267 4757 scope.go:117] "RemoveContainer" containerID="b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.417390 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.428538 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"edc9f1f6-2c76-4a89-bcfc-24d45502027a","Type":"ContainerStarted","Data":"a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.428577 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"edc9f1f6-2c76-4a89-bcfc-24d45502027a","Type":"ContainerStarted","Data":"39310a2e6c37b72fe30d7feca6d857303905aa41a11eb9f51d06c6bf4aea4049"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.437057 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34ac3c8a-f8cf-404e-93ed-8c422df4bebf","Type":"ContainerStarted","Data":"690f26d2d51b88d356f69e48ac1a90ec32376097bd7392ec73c32166efe92b52"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.437109 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34ac3c8a-f8cf-404e-93ed-8c422df4bebf","Type":"ContainerStarted","Data":"5355b484c84eb930d16149103d41f17c6a59d839b97b3ce015fbf779f794595a"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.437119 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34ac3c8a-f8cf-404e-93ed-8c422df4bebf","Type":"ContainerStarted","Data":"60ab2d04e2feb9604c7c4db879aaeae2a4fbabaea63fa95b19c72985439b207b"} Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.478042 4757 scope.go:117] "RemoveContainer" containerID="ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.481571 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcgbq\" (UniqueName: \"kubernetes.io/projected/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-kube-api-access-fcgbq\") pod \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.481624 4757 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-config-data\") pod \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.481647 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-logs\") pod \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.481703 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-combined-ca-bundle\") pod \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\" (UID: \"8d35ab5a-645f-40c9-ba7a-288a5ed7722a\") " Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.483103 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-logs" (OuterVolumeSpecName: "logs") pod "8d35ab5a-645f-40c9-ba7a-288a5ed7722a" (UID: "8d35ab5a-645f-40c9-ba7a-288a5ed7722a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.490573 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.490550975 podStartE2EDuration="2.490550975s" podCreationTimestamp="2025-12-16 13:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:16.478529568 +0000 UTC m=+1461.906273364" watchObservedRunningTime="2025-12-16 13:11:16.490550975 +0000 UTC m=+1461.918294771" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.504645 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-kube-api-access-fcgbq" (OuterVolumeSpecName: "kube-api-access-fcgbq") pod "8d35ab5a-645f-40c9-ba7a-288a5ed7722a" (UID: "8d35ab5a-645f-40c9-ba7a-288a5ed7722a"). InnerVolumeSpecName "kube-api-access-fcgbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.536613 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.536585496 podStartE2EDuration="2.536585496s" podCreationTimestamp="2025-12-16 13:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:16.531583943 +0000 UTC m=+1461.959327729" watchObservedRunningTime="2025-12-16 13:11:16.536585496 +0000 UTC m=+1461.964329292" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.548222 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d35ab5a-645f-40c9-ba7a-288a5ed7722a" (UID: "8d35ab5a-645f-40c9-ba7a-288a5ed7722a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.548865 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-config-data" (OuterVolumeSpecName: "config-data") pod "8d35ab5a-645f-40c9-ba7a-288a5ed7722a" (UID: "8d35ab5a-645f-40c9-ba7a-288a5ed7722a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.583664 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcgbq\" (UniqueName: \"kubernetes.io/projected/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-kube-api-access-fcgbq\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.583722 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.583738 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.583748 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d35ab5a-645f-40c9-ba7a-288a5ed7722a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.704326 4757 scope.go:117] "RemoveContainer" containerID="b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32" Dec 16 13:11:16 crc kubenswrapper[4757]: E1216 13:11:16.717201 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32\": container with ID starting with b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32 not found: ID does not exist" containerID="b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.717254 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32"} err="failed to get container status \"b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32\": rpc error: code = NotFound desc = could not find container \"b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32\": container with ID starting with b2fe659da9b648e5cd0a279d9855d55482e43dd365f37bbf2c223891db85da32 not found: ID does not exist" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.717294 4757 scope.go:117] "RemoveContainer" containerID="ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789" Dec 16 13:11:16 crc kubenswrapper[4757]: E1216 13:11:16.721086 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789\": container with ID starting with ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789 not found: ID does not exist" containerID="ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.721133 4757 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789"} err="failed to get container status \"ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789\": rpc error: code = NotFound desc = could not find container \"ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789\": container with ID starting with ff527dbfa546699e89e7186e5ed187a52a524af9029c37e3b88743cccc5c5789 not found: ID does not exist" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.799081 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.809274 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.824744 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:16 crc kubenswrapper[4757]: E1216 13:11:16.828801 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-api" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.828839 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-api" Dec 16 13:11:16 crc kubenswrapper[4757]: E1216 13:11:16.828878 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-log" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.828886 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-log" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.829141 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-log" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.829158 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" containerName="nova-api-api" Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.830145 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.830145 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.850583 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.851296 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.900038 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-config-data\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.900136 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.900165 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lq4h\" (UniqueName: \"kubernetes.io/projected/5c8a1f83-5373-486b-b6a5-f5631745dcf9-kube-api-access-8lq4h\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.900214 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a1f83-5373-486b-b6a5-f5631745dcf9-logs\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:16 crc kubenswrapper[4757]: I1216 13:11:16.960017 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d35ab5a-645f-40c9-ba7a-288a5ed7722a" path="/var/lib/kubelet/pods/8d35ab5a-645f-40c9-ba7a-288a5ed7722a/volumes"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.001669 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.001730 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lq4h\" (UniqueName: \"kubernetes.io/projected/5c8a1f83-5373-486b-b6a5-f5631745dcf9-kube-api-access-8lq4h\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.001794 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a1f83-5373-486b-b6a5-f5631745dcf9-logs\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.001872 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-config-data\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.002541 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a1f83-5373-486b-b6a5-f5631745dcf9-logs\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.018156 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.018225 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-config-data\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.030800 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lq4h\" (UniqueName: \"kubernetes.io/projected/5c8a1f83-5373-486b-b6a5-f5631745dcf9-kube-api-access-8lq4h\") pod \"nova-api-0\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.146331 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.226093 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hnvl7"]
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.228768 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnvl7"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.254207 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hnvl7"]
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.415520 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-utilities\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.415933 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rld\" (UniqueName: \"kubernetes.io/projected/29c98223-0758-44c5-b763-0f1b6e296d13-kube-api-access-n4rld\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7"
Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.416034 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-catalog-content\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7"
(UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.517785 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rld\" (UniqueName: \"kubernetes.io/projected/29c98223-0758-44c5-b763-0f1b6e296d13-kube-api-access-n4rld\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.517848 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-catalog-content\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.518572 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-catalog-content\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.518829 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-utilities\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.545852 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rld\" (UniqueName: \"kubernetes.io/projected/29c98223-0758-44c5-b763-0f1b6e296d13-kube-api-access-n4rld\") pod \"certified-operators-hnvl7\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.554601 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:17 crc kubenswrapper[4757]: I1216 13:11:17.814074 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.194740 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hnvl7"] Dec 16 13:11:18 crc kubenswrapper[4757]: W1216 13:11:18.207199 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c98223_0758_44c5_b763_0f1b6e296d13.slice/crio-b58fc008f37866147c2b1f3f0f8a8284fb0c7a12d049809b986177a1906fb8ec WatchSource:0}: Error finding container b58fc008f37866147c2b1f3f0f8a8284fb0c7a12d049809b986177a1906fb8ec: Status 404 returned error can't find the container with id b58fc008f37866147c2b1f3f0f8a8284fb0c7a12d049809b986177a1906fb8ec Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.462449 4757 generic.go:334] "Generic (PLEG): container finished" podID="29c98223-0758-44c5-b763-0f1b6e296d13" containerID="5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003" exitCode=0 Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.462533 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerDied","Data":"5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003"} Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.462563 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerStarted","Data":"b58fc008f37866147c2b1f3f0f8a8284fb0c7a12d049809b986177a1906fb8ec"} Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.468461 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c8a1f83-5373-486b-b6a5-f5631745dcf9","Type":"ContainerStarted","Data":"fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5"} Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.469690 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c8a1f83-5373-486b-b6a5-f5631745dcf9","Type":"ContainerStarted","Data":"31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f"} Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.469798 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c8a1f83-5373-486b-b6a5-f5631745dcf9","Type":"ContainerStarted","Data":"abb120809f8e2e9336d34e29da0e6ce855b6724dd7f76f15f358396ef0876748"} Dec 16 13:11:18 crc kubenswrapper[4757]: I1216 13:11:18.511785 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.511762353 podStartE2EDuration="2.511762353s" podCreationTimestamp="2025-12-16 13:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:18.510161394 +0000 UTC m=+1463.937905210" watchObservedRunningTime="2025-12-16 13:11:18.511762353 +0000 UTC m=+1463.939506149" Dec 16 13:11:19 crc kubenswrapper[4757]: I1216 13:11:19.480718 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" 
event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerStarted","Data":"bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a"} Dec 16 13:11:19 crc kubenswrapper[4757]: I1216 13:11:19.816287 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 13:11:19 crc kubenswrapper[4757]: I1216 13:11:19.816343 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 13:11:20 crc kubenswrapper[4757]: I1216 13:11:20.203970 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 13:11:21 crc kubenswrapper[4757]: I1216 13:11:21.181193 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:11:21 crc kubenswrapper[4757]: I1216 13:11:21.181284 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:11:21 crc kubenswrapper[4757]: E1216 13:11:21.437768 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c98223_0758_44c5_b763_0f1b6e296d13.slice/crio-bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c98223_0758_44c5_b763_0f1b6e296d13.slice/crio-conmon-bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:11:21 crc kubenswrapper[4757]: I1216 13:11:21.509655 4757 generic.go:334] "Generic (PLEG): container finished" podID="29c98223-0758-44c5-b763-0f1b6e296d13" containerID="bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a" exitCode=0 Dec 16 13:11:21 crc kubenswrapper[4757]: I1216 13:11:21.509746 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerDied","Data":"bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a"} Dec 16 13:11:22 crc kubenswrapper[4757]: I1216 13:11:22.519195 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerStarted","Data":"4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab"} Dec 16 13:11:22 crc kubenswrapper[4757]: I1216 13:11:22.548221 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hnvl7" podStartSLOduration=1.754343054 podStartE2EDuration="5.548189409s" podCreationTimestamp="2025-12-16 13:11:17 +0000 UTC" firstStartedPulling="2025-12-16 13:11:18.464507652 +0000 UTC m=+1463.892251448" lastFinishedPulling="2025-12-16 13:11:22.258354007 +0000 UTC m=+1467.686097803" observedRunningTime="2025-12-16 13:11:22.54621823 +0000 UTC m=+1467.973962036" watchObservedRunningTime="2025-12-16 
13:11:22.548189409 +0000 UTC m=+1467.975933205" Dec 16 13:11:23 crc kubenswrapper[4757]: I1216 13:11:23.764079 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 13:11:24 crc kubenswrapper[4757]: I1216 13:11:24.816095 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 13:11:24 crc kubenswrapper[4757]: I1216 13:11:24.816417 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 13:11:25 crc kubenswrapper[4757]: I1216 13:11:25.204068 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 13:11:25 crc kubenswrapper[4757]: I1216 13:11:25.231300 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 13:11:25 crc kubenswrapper[4757]: I1216 13:11:25.568368 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 13:11:25 crc kubenswrapper[4757]: I1216 13:11:25.829321 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:11:25 crc kubenswrapper[4757]: I1216 13:11:25.829374 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:11:27 crc kubenswrapper[4757]: I1216 13:11:27.147373 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 13:11:27 crc kubenswrapper[4757]: I1216 13:11:27.147623 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 13:11:27 crc kubenswrapper[4757]: I1216 13:11:27.554782 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:27 crc kubenswrapper[4757]: I1216 13:11:27.555790 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:27 crc kubenswrapper[4757]: I1216 13:11:27.615423 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:28 crc kubenswrapper[4757]: I1216 13:11:28.229150 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:11:28 crc kubenswrapper[4757]: I1216 13:11:28.229247 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:11:28 crc kubenswrapper[4757]: I1216 13:11:28.613968 4757 kubelet.go:2542] "SyncLoop 
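The unhealthy-then-started startup-probe transitions and the HTTP timeouts above (GETs on :8775 and :8774 exceeding Client.Timeout) are the expected shape of an HTTP startup probe guarding a slow-booting API. A sketch of probe wiring consistent with that behavior; the port and path come from the log, but the periods and thresholds are assumptions, not the operator's actual settings:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// Probe shapes consistent with the nova-api lines above: an HTTP GET on
// :8774/ used first as a startup probe (a few failures are normal while the
// API boots), then as a readiness probe once started.
func novaAPIProbes() (startup, readiness *corev1.Probe) {
	get := corev1.ProbeHandler{
		HTTPGet: &corev1.HTTPGetAction{Path: "/", Port: intstr.FromInt(8774)},
	}
	startup = &corev1.Probe{
		ProbeHandler:     get,
		PeriodSeconds:    10, // each failed attempt logs probeResult="failure"
		FailureThreshold: 30, // tolerate a slow cold start before killing
		TimeoutSeconds:   5,  // exceeded => "Client.Timeout exceeded" output
	}
	readiness = &corev1.Probe{
		ProbeHandler:  get,
		PeriodSeconds: 5,
	}
	return startup, readiness
}

func main() {
	s, r := novaAPIProbes()
	fmt.Printf("startup:   %+v\nreadiness: %+v\n", s.HTTPGet, r.HTTPGet)
}
```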
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:28 crc kubenswrapper[4757]: I1216 13:11:28.675884 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hnvl7"] Dec 16 13:11:30 crc kubenswrapper[4757]: I1216 13:11:30.585024 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hnvl7" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="registry-server" containerID="cri-o://4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab" gracePeriod=2 Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.142971 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.227452 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rld\" (UniqueName: \"kubernetes.io/projected/29c98223-0758-44c5-b763-0f1b6e296d13-kube-api-access-n4rld\") pod \"29c98223-0758-44c5-b763-0f1b6e296d13\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.227542 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-catalog-content\") pod \"29c98223-0758-44c5-b763-0f1b6e296d13\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.236549 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c98223-0758-44c5-b763-0f1b6e296d13-kube-api-access-n4rld" (OuterVolumeSpecName: "kube-api-access-n4rld") pod "29c98223-0758-44c5-b763-0f1b6e296d13" (UID: "29c98223-0758-44c5-b763-0f1b6e296d13"). InnerVolumeSpecName "kube-api-access-n4rld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.282249 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29c98223-0758-44c5-b763-0f1b6e296d13" (UID: "29c98223-0758-44c5-b763-0f1b6e296d13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.329473 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-utilities\") pod \"29c98223-0758-44c5-b763-0f1b6e296d13\" (UID: \"29c98223-0758-44c5-b763-0f1b6e296d13\") " Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.330094 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rld\" (UniqueName: \"kubernetes.io/projected/29c98223-0758-44c5-b763-0f1b6e296d13-kube-api-access-n4rld\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.330114 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.330254 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-utilities" (OuterVolumeSpecName: "utilities") pod "29c98223-0758-44c5-b763-0f1b6e296d13" (UID: "29c98223-0758-44c5-b763-0f1b6e296d13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.344045 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.431592 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c98223-0758-44c5-b763-0f1b6e296d13-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.532593 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-config-data\") pod \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.532937 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mflgx\" (UniqueName: \"kubernetes.io/projected/0357997f-6169-4fc2-9c56-e4ccfb8fb694-kube-api-access-mflgx\") pod \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.533045 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-combined-ca-bundle\") pod \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\" (UID: \"0357997f-6169-4fc2-9c56-e4ccfb8fb694\") " Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.536486 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0357997f-6169-4fc2-9c56-e4ccfb8fb694-kube-api-access-mflgx" (OuterVolumeSpecName: "kube-api-access-mflgx") pod "0357997f-6169-4fc2-9c56-e4ccfb8fb694" (UID: "0357997f-6169-4fc2-9c56-e4ccfb8fb694"). InnerVolumeSpecName "kube-api-access-mflgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.558278 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0357997f-6169-4fc2-9c56-e4ccfb8fb694" (UID: "0357997f-6169-4fc2-9c56-e4ccfb8fb694"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.567188 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-config-data" (OuterVolumeSpecName: "config-data") pod "0357997f-6169-4fc2-9c56-e4ccfb8fb694" (UID: "0357997f-6169-4fc2-9c56-e4ccfb8fb694"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.596816 4757 generic.go:334] "Generic (PLEG): container finished" podID="29c98223-0758-44c5-b763-0f1b6e296d13" containerID="4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab" exitCode=0 Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.596874 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerDied","Data":"4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab"} Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.596899 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hnvl7" event={"ID":"29c98223-0758-44c5-b763-0f1b6e296d13","Type":"ContainerDied","Data":"b58fc008f37866147c2b1f3f0f8a8284fb0c7a12d049809b986177a1906fb8ec"} Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.596916 4757 scope.go:117] "RemoveContainer" containerID="4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.597056 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hnvl7" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.605684 4757 generic.go:334] "Generic (PLEG): container finished" podID="0357997f-6169-4fc2-9c56-e4ccfb8fb694" containerID="8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d" exitCode=137 Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.605711 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.605731 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0357997f-6169-4fc2-9c56-e4ccfb8fb694","Type":"ContainerDied","Data":"8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d"} Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.606628 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0357997f-6169-4fc2-9c56-e4ccfb8fb694","Type":"ContainerDied","Data":"63d211ec315f5fdc9626001e2765f99b27d9d0d39dc4aaf6a0d7d93845be90ce"} Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.634080 4757 scope.go:117] "RemoveContainer" containerID="bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.634912 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.635031 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mflgx\" (UniqueName: \"kubernetes.io/projected/0357997f-6169-4fc2-9c56-e4ccfb8fb694-kube-api-access-mflgx\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.635044 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0357997f-6169-4fc2-9c56-e4ccfb8fb694-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.653224 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hnvl7"] Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.707197 4757 scope.go:117] "RemoveContainer" containerID="5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.719276 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hnvl7"] Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.756205 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.773811 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.774210 4757 scope.go:117] "RemoveContainer" containerID="4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.774670 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab\": container with ID starting with 4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab not found: ID does not exist" containerID="4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.774697 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab"} err="failed to get container status \"4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab\": rpc error: code = NotFound desc = could not find container 
\"4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab\": container with ID starting with 4c45cedf7bdf3b784314fd09a74dab3b7192ef4d56f8aeffd3e45dd94272c6ab not found: ID does not exist" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.774718 4757 scope.go:117] "RemoveContainer" containerID="bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.774909 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a\": container with ID starting with bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a not found: ID does not exist" containerID="bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.774928 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a"} err="failed to get container status \"bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a\": rpc error: code = NotFound desc = could not find container \"bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a\": container with ID starting with bcbb06d6c7498bcc7c2f015f96101ad08e92716d5a08d4efd1c68ef99c90356a not found: ID does not exist" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.774941 4757 scope.go:117] "RemoveContainer" containerID="5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.775369 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003\": container with ID starting with 5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003 not found: ID does not exist" containerID="5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.775418 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003"} err="failed to get container status \"5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003\": rpc error: code = NotFound desc = could not find container \"5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003\": container with ID starting with 5e2c1ae8a21955f78dc082b4bea2f713ebab187772e671b4d006df20f49d9003 not found: ID does not exist" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.775433 4757 scope.go:117] "RemoveContainer" containerID="8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.786844 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.787396 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="extract-utilities" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.787525 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="extract-utilities" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.787700 4757 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0357997f-6169-4fc2-9c56-e4ccfb8fb694" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.788191 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="0357997f-6169-4fc2-9c56-e4ccfb8fb694" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.788292 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="extract-content" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.788383 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="extract-content" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.788462 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="registry-server" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.788535 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="registry-server" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.789712 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="0357997f-6169-4fc2-9c56-e4ccfb8fb694" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.789818 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" containerName="registry-server" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.790915 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.792578 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.793298 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.793523 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.803829 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.811439 4757 scope.go:117] "RemoveContainer" containerID="8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d" Dec 16 13:11:31 crc kubenswrapper[4757]: E1216 13:11:31.811908 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d\": container with ID starting with 8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d not found: ID does not exist" containerID="8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.811935 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d"} err="failed to get container status \"8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d\": rpc error: code = NotFound desc = could not find container \"8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d\": container with ID starting with 
8aeabf08ed91ef5fc532a84fb36c905138b203118f026b12383227af381a1f2d not found: ID does not exist" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.939686 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.940111 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9g9s\" (UniqueName: \"kubernetes.io/projected/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-kube-api-access-k9g9s\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.940335 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.940484 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:31 crc kubenswrapper[4757]: I1216 13:11:31.940635 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.042425 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.042784 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9g9s\" (UniqueName: \"kubernetes.io/projected/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-kube-api-access-k9g9s\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.042873 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.042947 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.043398 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.047709 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.047942 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.049884 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.051878 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.059089 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9g9s\" (UniqueName: \"kubernetes.io/projected/9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd-kube-api-access-k9g9s\") pod \"nova-cell1-novncproxy-0\" (UID: \"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.118388 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.548339 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.618547 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd","Type":"ContainerStarted","Data":"98f27eb3409a8ed2a402ae7c1e5a1387f53030705b1b615bf7df67453f368f96"} Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.967495 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0357997f-6169-4fc2-9c56-e4ccfb8fb694" path="/var/lib/kubelet/pods/0357997f-6169-4fc2-9c56-e4ccfb8fb694/volumes" Dec 16 13:11:32 crc kubenswrapper[4757]: I1216 13:11:32.969131 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c98223-0758-44c5-b763-0f1b6e296d13" path="/var/lib/kubelet/pods/29c98223-0758-44c5-b763-0f1b6e296d13/volumes" Dec 16 13:11:33 crc kubenswrapper[4757]: I1216 13:11:33.628232 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd","Type":"ContainerStarted","Data":"90188252b88933d2ba2d4540f9783c2771b7440fc850a7625d2446898c572c80"} Dec 16 13:11:33 crc kubenswrapper[4757]: I1216 13:11:33.659862 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.659812214 podStartE2EDuration="2.659812214s" podCreationTimestamp="2025-12-16 13:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:33.644854023 +0000 UTC m=+1479.072597829" watchObservedRunningTime="2025-12-16 13:11:33.659812214 +0000 UTC m=+1479.087556010" Dec 16 13:11:34 crc kubenswrapper[4757]: I1216 13:11:34.823259 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 13:11:34 crc kubenswrapper[4757]: I1216 13:11:34.828566 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 13:11:34 crc kubenswrapper[4757]: I1216 13:11:34.833151 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 13:11:35 crc kubenswrapper[4757]: I1216 13:11:35.653108 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.119878 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.151655 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.152669 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.158371 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.162417 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.670493 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.674990 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.878516 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"] Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.885473 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:37 crc kubenswrapper[4757]: I1216 13:11:37.893917 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"] Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.237480 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x8xc\" (UniqueName: \"kubernetes.io/projected/b9fe3303-7f1d-4f67-8a51-8276430fb66b-kube-api-access-2x8xc\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.237706 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.237783 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-config\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.237835 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.237871 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.238194 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.339253 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: 
I1216 13:11:38.339298 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-config\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.339320 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.339342 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.339426 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.339466 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x8xc\" (UniqueName: \"kubernetes.io/projected/b9fe3303-7f1d-4f67-8a51-8276430fb66b-kube-api-access-2x8xc\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.340571 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.340578 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-config\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.340697 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.340695 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.341364 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.385908 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x8xc\" (UniqueName: \"kubernetes.io/projected/b9fe3303-7f1d-4f67-8a51-8276430fb66b-kube-api-access-2x8xc\") pod \"dnsmasq-dns-cd5cbd7b9-5gkh8\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.464905 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:38 crc kubenswrapper[4757]: I1216 13:11:38.534658 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 13:11:39 crc kubenswrapper[4757]: I1216 13:11:39.011948 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"] Dec 16 13:11:39 crc kubenswrapper[4757]: W1216 13:11:39.030432 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9fe3303_7f1d_4f67_8a51_8276430fb66b.slice/crio-6b8760bbb265a39e403260115ffd4ca19fbe4d17f570a0bdeebe160c0ad18f43 WatchSource:0}: Error finding container 6b8760bbb265a39e403260115ffd4ca19fbe4d17f570a0bdeebe160c0ad18f43: Status 404 returned error can't find the container with id 6b8760bbb265a39e403260115ffd4ca19fbe4d17f570a0bdeebe160c0ad18f43 Dec 16 13:11:39 crc kubenswrapper[4757]: I1216 13:11:39.689676 4757 generic.go:334] "Generic (PLEG): container finished" podID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerID="b580e4f56ee4aa29c2341d3e1f6841f23e5031ff5ae51b95b1dfa171266368f5" exitCode=0 Dec 16 13:11:39 crc kubenswrapper[4757]: I1216 13:11:39.689779 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" event={"ID":"b9fe3303-7f1d-4f67-8a51-8276430fb66b","Type":"ContainerDied","Data":"b580e4f56ee4aa29c2341d3e1f6841f23e5031ff5ae51b95b1dfa171266368f5"} Dec 16 13:11:39 crc kubenswrapper[4757]: I1216 13:11:39.690092 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" event={"ID":"b9fe3303-7f1d-4f67-8a51-8276430fb66b","Type":"ContainerStarted","Data":"6b8760bbb265a39e403260115ffd4ca19fbe4d17f570a0bdeebe160c0ad18f43"} Dec 16 13:11:40 crc kubenswrapper[4757]: I1216 13:11:40.701877 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" event={"ID":"b9fe3303-7f1d-4f67-8a51-8276430fb66b","Type":"ContainerStarted","Data":"8a20fb86812f562f1ac35b77ebad8b3889603f3b2960c45a5c438321058abaae"} Dec 16 13:11:40 crc kubenswrapper[4757]: I1216 13:11:40.702453 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:40 crc kubenswrapper[4757]: I1216 13:11:40.728766 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" podStartSLOduration=3.728743148 podStartE2EDuration="3.728743148s" podCreationTimestamp="2025-12-16 13:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:40.722963325 +0000 UTC m=+1486.150707121" watchObservedRunningTime="2025-12-16 
13:11:40.728743148 +0000 UTC m=+1486.156486954" Dec 16 13:11:40 crc kubenswrapper[4757]: I1216 13:11:40.920987 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:40 crc kubenswrapper[4757]: I1216 13:11:40.921229 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-log" containerID="cri-o://31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f" gracePeriod=30 Dec 16 13:11:40 crc kubenswrapper[4757]: I1216 13:11:40.921497 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-api" containerID="cri-o://fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5" gracePeriod=30 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.210017 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.210578 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-central-agent" containerID="cri-o://7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036" gracePeriod=30 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.210600 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="proxy-httpd" containerID="cri-o://733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0" gracePeriod=30 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.210684 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-notification-agent" containerID="cri-o://5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c" gracePeriod=30 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.210690 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="sg-core" containerID="cri-o://a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5" gracePeriod=30 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.713469 4757 generic.go:334] "Generic (PLEG): container finished" podID="d1136893-32fb-41f6-97cb-466c3819677e" containerID="733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0" exitCode=0 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.713496 4757 generic.go:334] "Generic (PLEG): container finished" podID="d1136893-32fb-41f6-97cb-466c3819677e" containerID="a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5" exitCode=2 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.713504 4757 generic.go:334] "Generic (PLEG): container finished" podID="d1136893-32fb-41f6-97cb-466c3819677e" containerID="7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036" exitCode=0 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.713547 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerDied","Data":"733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0"} Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.713572 
4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerDied","Data":"a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5"} Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.713583 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerDied","Data":"7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036"} Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.716809 4757 generic.go:334] "Generic (PLEG): container finished" podID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerID="31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f" exitCode=143 Dec 16 13:11:41 crc kubenswrapper[4757]: I1216 13:11:41.717628 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c8a1f83-5373-486b-b6a5-f5631745dcf9","Type":"ContainerDied","Data":"31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f"} Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.118925 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.137753 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.743817 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.958879 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vkdzh"] Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.960356 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.962442 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 13:11:42 crc kubenswrapper[4757]: I1216 13:11:42.962501 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:42.973735 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkdzh"] Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.142604 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8vz\" (UniqueName: \"kubernetes.io/projected/849962c6-8103-4d96-8136-23acb6221049-kube-api-access-rh8vz\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.142706 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-config-data\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.142747 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-scripts\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.142823 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.244958 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-scripts\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.245095 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.245176 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8vz\" (UniqueName: \"kubernetes.io/projected/849962c6-8103-4d96-8136-23acb6221049-kube-api-access-rh8vz\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.245276 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-config-data\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.251683 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-scripts\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.252053 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-config-data\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.252495 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.267892 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8vz\" (UniqueName: \"kubernetes.io/projected/849962c6-8103-4d96-8136-23acb6221049-kube-api-access-rh8vz\") pod \"nova-cell1-cell-mapping-vkdzh\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.352737 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:43 crc kubenswrapper[4757]: I1216 13:11:43.833670 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkdzh"] Dec 16 13:11:43 crc kubenswrapper[4757]: W1216 13:11:43.841907 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod849962c6_8103_4d96_8136_23acb6221049.slice/crio-a9bc24964f179fe4feb48838055a3a3fbc1066c57a869b94dc8345c7c5059265 WatchSource:0}: Error finding container a9bc24964f179fe4feb48838055a3a3fbc1066c57a869b94dc8345c7c5059265: Status 404 returned error can't find the container with id a9bc24964f179fe4feb48838055a3a3fbc1066c57a869b94dc8345c7c5059265 Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.581450 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.703206 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a1f83-5373-486b-b6a5-f5631745dcf9-logs\") pod \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.703421 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-config-data\") pod \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.703479 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lq4h\" (UniqueName: \"kubernetes.io/projected/5c8a1f83-5373-486b-b6a5-f5631745dcf9-kube-api-access-8lq4h\") pod \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.703536 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-combined-ca-bundle\") pod \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\" (UID: \"5c8a1f83-5373-486b-b6a5-f5631745dcf9\") " Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.704765 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8a1f83-5373-486b-b6a5-f5631745dcf9-logs" (OuterVolumeSpecName: "logs") pod "5c8a1f83-5373-486b-b6a5-f5631745dcf9" (UID: "5c8a1f83-5373-486b-b6a5-f5631745dcf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.721330 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8a1f83-5373-486b-b6a5-f5631745dcf9-kube-api-access-8lq4h" (OuterVolumeSpecName: "kube-api-access-8lq4h") pod "5c8a1f83-5373-486b-b6a5-f5631745dcf9" (UID: "5c8a1f83-5373-486b-b6a5-f5631745dcf9"). InnerVolumeSpecName "kube-api-access-8lq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.743727 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-config-data" (OuterVolumeSpecName: "config-data") pod "5c8a1f83-5373-486b-b6a5-f5631745dcf9" (UID: "5c8a1f83-5373-486b-b6a5-f5631745dcf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.751191 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c8a1f83-5373-486b-b6a5-f5631745dcf9" (UID: "5c8a1f83-5373-486b-b6a5-f5631745dcf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.754761 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkdzh" event={"ID":"849962c6-8103-4d96-8136-23acb6221049","Type":"ContainerStarted","Data":"7615e52cc5d43f93c458b907c7cd86fadf052d622b2f992133ab915df9bd2a88"} Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.754978 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkdzh" event={"ID":"849962c6-8103-4d96-8136-23acb6221049","Type":"ContainerStarted","Data":"a9bc24964f179fe4feb48838055a3a3fbc1066c57a869b94dc8345c7c5059265"} Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.775845 4757 generic.go:334] "Generic (PLEG): container finished" podID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerID="fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5" exitCode=0 Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.775892 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c8a1f83-5373-486b-b6a5-f5631745dcf9","Type":"ContainerDied","Data":"fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5"} Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.775922 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c8a1f83-5373-486b-b6a5-f5631745dcf9","Type":"ContainerDied","Data":"abb120809f8e2e9336d34e29da0e6ce855b6724dd7f76f15f358396ef0876748"} Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.775932 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.775943 4757 scope.go:117] "RemoveContainer" containerID="fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.801988 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vkdzh" podStartSLOduration=2.801971527 podStartE2EDuration="2.801971527s" podCreationTimestamp="2025-12-16 13:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:44.791560468 +0000 UTC m=+1490.219304284" watchObservedRunningTime="2025-12-16 13:11:44.801971527 +0000 UTC m=+1490.229715323" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.807556 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.807583 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lq4h\" (UniqueName: \"kubernetes.io/projected/5c8a1f83-5373-486b-b6a5-f5631745dcf9-kube-api-access-8lq4h\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.807593 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a1f83-5373-486b-b6a5-f5631745dcf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.807602 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a1f83-5373-486b-b6a5-f5631745dcf9-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:44 crc kubenswrapper[4757]: 
I1216 13:11:44.860848 4757 scope.go:117] "RemoveContainer" containerID="31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.863224 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.889434 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.908154 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:44 crc kubenswrapper[4757]: E1216 13:11:44.908771 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-api" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.908789 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-api" Dec 16 13:11:44 crc kubenswrapper[4757]: E1216 13:11:44.908874 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-log" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.908884 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-log" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.909154 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-log" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.909192 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" containerName="nova-api-api" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.911213 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.911501 4757 scope.go:117] "RemoveContainer" containerID="fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.915285 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.915457 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 13:11:44 crc kubenswrapper[4757]: E1216 13:11:44.918921 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5\": container with ID starting with fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5 not found: ID does not exist" containerID="fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.918951 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5"} err="failed to get container status \"fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5\": rpc error: code = NotFound desc = could not find container \"fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5\": container with ID starting with fdc6dd78342cd5cdcf02c11acc6cca93513f52dcaaee1907c98516e80d1cbfe5 not found: ID does not exist" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.918973 4757 scope.go:117] "RemoveContainer" containerID="31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f" Dec 16 13:11:44 crc kubenswrapper[4757]: E1216 13:11:44.919292 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f\": container with ID starting with 31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f not found: ID does not exist" containerID="31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.919309 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f"} err="failed to get container status \"31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f\": rpc error: code = NotFound desc = could not find container \"31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f\": container with ID starting with 31bb3f53b943cf96c323d7c5c5d16820263552932ccb3f2214803c3b91c4676f not found: ID does not exist" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.919737 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.923293 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 13:11:44 crc kubenswrapper[4757]: I1216 13:11:44.968404 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8a1f83-5373-486b-b6a5-f5631745dcf9" path="/var/lib/kubelet/pods/5c8a1f83-5373-486b-b6a5-f5631745dcf9/volumes" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.011740 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvqmt\" (UniqueName: \"kubernetes.io/projected/577dca3f-5fa8-429a-a3fd-5913f0acd836-kube-api-access-xvqmt\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.012718 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-config-data\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.012814 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.012881 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-internal-tls-certs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.012941 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-public-tls-certs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.013121 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577dca3f-5fa8-429a-a3fd-5913f0acd836-logs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.114627 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577dca3f-5fa8-429a-a3fd-5913f0acd836-logs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.114706 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvqmt\" (UniqueName: \"kubernetes.io/projected/577dca3f-5fa8-429a-a3fd-5913f0acd836-kube-api-access-xvqmt\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.114815 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-config-data\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.114848 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 
crc kubenswrapper[4757]: I1216 13:11:45.114877 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-internal-tls-certs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.114901 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-public-tls-certs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.116198 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577dca3f-5fa8-429a-a3fd-5913f0acd836-logs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.120036 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-config-data\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.120273 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-internal-tls-certs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.120607 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.121496 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-public-tls-certs\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.137742 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvqmt\" (UniqueName: \"kubernetes.io/projected/577dca3f-5fa8-429a-a3fd-5913f0acd836-kube-api-access-xvqmt\") pod \"nova-api-0\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.249715 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.541461 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623165 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-config-data\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623256 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-sg-core-conf-yaml\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623285 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-combined-ca-bundle\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623480 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-scripts\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623526 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-ceilometer-tls-certs\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623575 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpmpn\" (UniqueName: \"kubernetes.io/projected/d1136893-32fb-41f6-97cb-466c3819677e-kube-api-access-xpmpn\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623611 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-log-httpd\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.623631 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-run-httpd\") pod \"d1136893-32fb-41f6-97cb-466c3819677e\" (UID: \"d1136893-32fb-41f6-97cb-466c3819677e\") " Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.625893 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.630936 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-scripts" (OuterVolumeSpecName: "scripts") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.631527 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.636112 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1136893-32fb-41f6-97cb-466c3819677e-kube-api-access-xpmpn" (OuterVolumeSpecName: "kube-api-access-xpmpn") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "kube-api-access-xpmpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.682649 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.730166 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.730207 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpmpn\" (UniqueName: \"kubernetes.io/projected/d1136893-32fb-41f6-97cb-466c3819677e-kube-api-access-xpmpn\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.731064 4757 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.731079 4757 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1136893-32fb-41f6-97cb-466c3819677e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.731087 4757 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.771113 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.832627 4757 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.861772 4757 generic.go:334] "Generic (PLEG): container finished" podID="d1136893-32fb-41f6-97cb-466c3819677e" containerID="5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c" exitCode=0 Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.861878 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerDied","Data":"5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c"} Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.861912 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1136893-32fb-41f6-97cb-466c3819677e","Type":"ContainerDied","Data":"37b7f49bbbdc53559c7b19796c3bf27008270cf1cef18fed9bab2af69e02a064"} Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.861930 4757 scope.go:117] "RemoveContainer" containerID="733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.862178 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.884784 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-config-data" (OuterVolumeSpecName: "config-data") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.888308 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1136893-32fb-41f6-97cb-466c3819677e" (UID: "d1136893-32fb-41f6-97cb-466c3819677e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.899346 4757 scope.go:117] "RemoveContainer" containerID="a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.937969 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.938036 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1136893-32fb-41f6-97cb-466c3819677e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.946158 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:45 crc kubenswrapper[4757]: I1216 13:11:45.979751 4757 scope.go:117] "RemoveContainer" containerID="5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.070842 4757 scope.go:117] "RemoveContainer" containerID="7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.102716 4757 scope.go:117] "RemoveContainer" containerID="733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0" Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.104455 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0\": container with ID starting with 733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0 not found: ID does not exist" containerID="733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.104500 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0"} err="failed to get container status \"733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0\": rpc error: code = NotFound desc = could not find container \"733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0\": container with ID starting with 733ccb39b344b2e19f7c1f3b3a135db87e77a1c83326e4a9f6e5953b3bba18b0 not found: ID does not exist" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.104526 4757 scope.go:117] "RemoveContainer" containerID="a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5" Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.106133 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5\": container with ID starting with a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5 not found: ID does not exist" containerID="a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.106164 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5"} err="failed to get container status \"a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5\": rpc error: code = NotFound desc = could not find container 
\"a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5\": container with ID starting with a214e4af06ed04b0c7b5cee312f0e94154e14e4a18430420bf28f9a8248acad5 not found: ID does not exist" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.106181 4757 scope.go:117] "RemoveContainer" containerID="5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c" Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.106505 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c\": container with ID starting with 5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c not found: ID does not exist" containerID="5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.106553 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c"} err="failed to get container status \"5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c\": rpc error: code = NotFound desc = could not find container \"5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c\": container with ID starting with 5cc761176fb7cb480d4c7f47eef52e66a5c01b310b415871e17956c843c8039c not found: ID does not exist" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.106584 4757 scope.go:117] "RemoveContainer" containerID="7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036" Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.111366 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036\": container with ID starting with 7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036 not found: ID does not exist" containerID="7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.111398 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036"} err="failed to get container status \"7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036\": rpc error: code = NotFound desc = could not find container \"7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036\": container with ID starting with 7b4f63ec544be84607e13fca6f7af182fff198df1412bad821dfcf5fb71b7036 not found: ID does not exist" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.221095 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.235451 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.243862 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.252232 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-notification-agent" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.252440 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-notification-agent" Dec 16 13:11:46 
crc kubenswrapper[4757]: E1216 13:11:46.252524 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="sg-core" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.252592 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="sg-core" Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.252684 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="proxy-httpd" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.252773 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="proxy-httpd" Dec 16 13:11:46 crc kubenswrapper[4757]: E1216 13:11:46.252848 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-central-agent" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.252915 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-central-agent" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.253215 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="proxy-httpd" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.253296 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-notification-agent" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.253365 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="sg-core" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.253443 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1136893-32fb-41f6-97cb-466c3819677e" containerName="ceilometer-central-agent" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.255446 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.262230 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.262352 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.262719 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.295613 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.366250 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a6f73f-1823-407a-a9f2-c693e5ddcca9-run-httpd\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.367155 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-scripts\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.367633 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.370603 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlpm6\" (UniqueName: \"kubernetes.io/projected/65a6f73f-1823-407a-a9f2-c693e5ddcca9-kube-api-access-wlpm6\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.371409 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-config-data\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.372287 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.372759 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.374353 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/65a6f73f-1823-407a-a9f2-c693e5ddcca9-log-httpd\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478660 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478753 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a6f73f-1823-407a-a9f2-c693e5ddcca9-log-httpd\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478780 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a6f73f-1823-407a-a9f2-c693e5ddcca9-run-httpd\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478838 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-scripts\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478878 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478966 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-config-data\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.478990 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlpm6\" (UniqueName: \"kubernetes.io/projected/65a6f73f-1823-407a-a9f2-c693e5ddcca9-kube-api-access-wlpm6\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.479045 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.481235 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65a6f73f-1823-407a-a9f2-c693e5ddcca9-log-httpd\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.483934 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/65a6f73f-1823-407a-a9f2-c693e5ddcca9-run-httpd\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.485091 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.485425 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-scripts\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.486122 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.487125 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-config-data\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.494679 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a6f73f-1823-407a-a9f2-c693e5ddcca9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.507240 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlpm6\" (UniqueName: \"kubernetes.io/projected/65a6f73f-1823-407a-a9f2-c693e5ddcca9-kube-api-access-wlpm6\") pod \"ceilometer-0\" (UID: \"65a6f73f-1823-407a-a9f2-c693e5ddcca9\") " pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.558347 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.907922 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"577dca3f-5fa8-429a-a3fd-5913f0acd836","Type":"ContainerStarted","Data":"ab24c6ca0319089d7a90e87725aa3faf795b768c3f535f3c5628e6e3b7002fca"} Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.908318 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"577dca3f-5fa8-429a-a3fd-5913f0acd836","Type":"ContainerStarted","Data":"de39c9dae387c9a7a0a0337645a1f8fda98b3c76c812fc6e19ebb15643aca6d7"} Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.908331 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"577dca3f-5fa8-429a-a3fd-5913f0acd836","Type":"ContainerStarted","Data":"8152bd5bd921449c848ac98981055688e6e4f227244588deffb1baceb77b1c11"} Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.947543 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.947519455 podStartE2EDuration="2.947519455s" podCreationTimestamp="2025-12-16 13:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:46.933963389 +0000 UTC m=+1492.361707195" watchObservedRunningTime="2025-12-16 13:11:46.947519455 +0000 UTC m=+1492.375263251" Dec 16 13:11:46 crc kubenswrapper[4757]: I1216 13:11:46.968034 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1136893-32fb-41f6-97cb-466c3819677e" path="/var/lib/kubelet/pods/d1136893-32fb-41f6-97cb-466c3819677e/volumes" Dec 16 13:11:47 crc kubenswrapper[4757]: I1216 13:11:47.074747 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 13:11:47 crc kubenswrapper[4757]: W1216 13:11:47.076140 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a6f73f_1823_407a_a9f2_c693e5ddcca9.slice/crio-ef0097fa10deb4fca722e65b7b6284810ea0a15bcd5999d184cab46136a87690 WatchSource:0}: Error finding container ef0097fa10deb4fca722e65b7b6284810ea0a15bcd5999d184cab46136a87690: Status 404 returned error can't find the container with id ef0097fa10deb4fca722e65b7b6284810ea0a15bcd5999d184cab46136a87690 Dec 16 13:11:47 crc kubenswrapper[4757]: I1216 13:11:47.921111 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"efccb4b3e660304aaaf43832f8e13cbebaf5a9f019ac85d25fea4b90b97c3966"} Dec 16 13:11:47 crc kubenswrapper[4757]: I1216 13:11:47.921475 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"ef0097fa10deb4fca722e65b7b6284810ea0a15bcd5999d184cab46136a87690"} Dec 16 13:11:48 crc kubenswrapper[4757]: I1216 13:11:48.466834 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" Dec 16 13:11:48 crc kubenswrapper[4757]: I1216 13:11:48.529022 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-tsdxz"] Dec 16 13:11:48 crc kubenswrapper[4757]: I1216 13:11:48.529262 4757 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerName="dnsmasq-dns" containerID="cri-o://58ff04a57656f775a9c1ef03e9c0bde8af2ec49fb928b2e9c092889d83a8dcbc" gracePeriod=10 Dec 16 13:11:48 crc kubenswrapper[4757]: I1216 13:11:48.943939 4757 generic.go:334] "Generic (PLEG): container finished" podID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerID="58ff04a57656f775a9c1ef03e9c0bde8af2ec49fb928b2e9c092889d83a8dcbc" exitCode=0 Dec 16 13:11:48 crc kubenswrapper[4757]: I1216 13:11:48.944076 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" event={"ID":"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff","Type":"ContainerDied","Data":"58ff04a57656f775a9c1ef03e9c0bde8af2ec49fb928b2e9c092889d83a8dcbc"} Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.194719 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.267728 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-sb\") pod \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.267792 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsvpm\" (UniqueName: \"kubernetes.io/projected/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-kube-api-access-nsvpm\") pod \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.267997 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-config\") pod \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.268118 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-svc\") pod \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.270329 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-swift-storage-0\") pod \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.270730 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-nb\") pod \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\" (UID: \"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff\") " Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.273729 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-kube-api-access-nsvpm" (OuterVolumeSpecName: "kube-api-access-nsvpm") pod "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" (UID: "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff"). InnerVolumeSpecName "kube-api-access-nsvpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.340377 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" (UID: "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.343630 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-config" (OuterVolumeSpecName: "config") pod "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" (UID: "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.343690 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" (UID: "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.348063 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" (UID: "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.360659 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" (UID: "5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.375788 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.375880 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.375896 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsvpm\" (UniqueName: \"kubernetes.io/projected/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-kube-api-access-nsvpm\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.375915 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-config\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.375927 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.375940 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.980882 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"dec1770f5c192f62e118934bead7638fcc2c0688f9c70c02c9368ca807f35240"} Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.983866 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" event={"ID":"5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff","Type":"ContainerDied","Data":"123e0a83426f420ee9acb931ae1e9f9d02502b2a2cdeae21408e14693f1a78e4"} Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.983918 4757 scope.go:117] "RemoveContainer" containerID="58ff04a57656f775a9c1ef03e9c0bde8af2ec49fb928b2e9c092889d83a8dcbc" Dec 16 13:11:49 crc kubenswrapper[4757]: I1216 13:11:49.984094 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-tsdxz" Dec 16 13:11:50 crc kubenswrapper[4757]: I1216 13:11:50.121605 4757 scope.go:117] "RemoveContainer" containerID="4bec5ae47d5ca16ebe65ad9171ac584d9eb1e957faaccfd36319aeb5243beb7f" Dec 16 13:11:50 crc kubenswrapper[4757]: I1216 13:11:50.224242 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-tsdxz"] Dec 16 13:11:50 crc kubenswrapper[4757]: I1216 13:11:50.255387 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-tsdxz"] Dec 16 13:11:50 crc kubenswrapper[4757]: I1216 13:11:50.964286 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" path="/var/lib/kubelet/pods/5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff/volumes" Dec 16 13:11:50 crc kubenswrapper[4757]: I1216 13:11:50.998746 4757 generic.go:334] "Generic (PLEG): container finished" podID="849962c6-8103-4d96-8136-23acb6221049" containerID="7615e52cc5d43f93c458b907c7cd86fadf052d622b2f992133ab915df9bd2a88" exitCode=0 Dec 16 13:11:50 crc kubenswrapper[4757]: I1216 13:11:50.998832 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkdzh" event={"ID":"849962c6-8103-4d96-8136-23acb6221049","Type":"ContainerDied","Data":"7615e52cc5d43f93c458b907c7cd86fadf052d622b2f992133ab915df9bd2a88"} Dec 16 13:11:51 crc kubenswrapper[4757]: I1216 13:11:51.002130 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"b125c282bdb73b2bb7c7a5ded6d0ec19f1c59a4f87ee7044b021cc5f4c037ade"} Dec 16 13:11:51 crc kubenswrapper[4757]: I1216 13:11:51.180857 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:11:51 crc kubenswrapper[4757]: I1216 13:11:51.180923 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:11:51 crc kubenswrapper[4757]: I1216 13:11:51.180994 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:11:51 crc kubenswrapper[4757]: I1216 13:11:51.181812 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:11:51 crc kubenswrapper[4757]: I1216 13:11:51.181906 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" gracePeriod=600 Dec 16 13:11:51 crc kubenswrapper[4757]: E1216 13:11:51.339967 4757 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.023412 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" exitCode=0 Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.023465 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"} Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.023822 4757 scope.go:117] "RemoveContainer" containerID="feaab26a71eb3b6535920da4cbeacb812adf972fd9cda852a626b3b8fac4ff4e" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.024571 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:11:52 crc kubenswrapper[4757]: E1216 13:11:52.024893 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.027469 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"8b0166669430c734442e9499558cdbb867db9d2695d680e1441b82996ddc24c3"} Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.090653 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.646185407 podStartE2EDuration="6.090633205s" podCreationTimestamp="2025-12-16 13:11:46 +0000 UTC" firstStartedPulling="2025-12-16 13:11:47.079766842 +0000 UTC m=+1492.507510638" lastFinishedPulling="2025-12-16 13:11:51.52421464 +0000 UTC m=+1496.951958436" observedRunningTime="2025-12-16 13:11:52.07954477 +0000 UTC m=+1497.507288566" watchObservedRunningTime="2025-12-16 13:11:52.090633205 +0000 UTC m=+1497.518377001" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.485213 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.560993 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh8vz\" (UniqueName: \"kubernetes.io/projected/849962c6-8103-4d96-8136-23acb6221049-kube-api-access-rh8vz\") pod \"849962c6-8103-4d96-8136-23acb6221049\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.561092 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-combined-ca-bundle\") pod \"849962c6-8103-4d96-8136-23acb6221049\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.561257 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-scripts\") pod \"849962c6-8103-4d96-8136-23acb6221049\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.561332 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-config-data\") pod \"849962c6-8103-4d96-8136-23acb6221049\" (UID: \"849962c6-8103-4d96-8136-23acb6221049\") " Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.577228 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-scripts" (OuterVolumeSpecName: "scripts") pod "849962c6-8103-4d96-8136-23acb6221049" (UID: "849962c6-8103-4d96-8136-23acb6221049"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.580268 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849962c6-8103-4d96-8136-23acb6221049-kube-api-access-rh8vz" (OuterVolumeSpecName: "kube-api-access-rh8vz") pod "849962c6-8103-4d96-8136-23acb6221049" (UID: "849962c6-8103-4d96-8136-23acb6221049"). InnerVolumeSpecName "kube-api-access-rh8vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.609196 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-config-data" (OuterVolumeSpecName: "config-data") pod "849962c6-8103-4d96-8136-23acb6221049" (UID: "849962c6-8103-4d96-8136-23acb6221049"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.610389 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "849962c6-8103-4d96-8136-23acb6221049" (UID: "849962c6-8103-4d96-8136-23acb6221049"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.663508 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.663558 4757 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.663569 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/849962c6-8103-4d96-8136-23acb6221049-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:52 crc kubenswrapper[4757]: I1216 13:11:52.663586 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh8vz\" (UniqueName: \"kubernetes.io/projected/849962c6-8103-4d96-8136-23acb6221049-kube-api-access-rh8vz\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.038415 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vkdzh" event={"ID":"849962c6-8103-4d96-8136-23acb6221049","Type":"ContainerDied","Data":"a9bc24964f179fe4feb48838055a3a3fbc1066c57a869b94dc8345c7c5059265"} Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.038801 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9bc24964f179fe4feb48838055a3a3fbc1066c57a869b94dc8345c7c5059265" Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.038901 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vkdzh" Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.042556 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.229661 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.229927 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-log" containerID="cri-o://de39c9dae387c9a7a0a0337645a1f8fda98b3c76c812fc6e19ebb15643aca6d7" gracePeriod=30 Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.230026 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-api" containerID="cri-o://ab24c6ca0319089d7a90e87725aa3faf795b768c3f535f3c5628e6e3b7002fca" gracePeriod=30 Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.258568 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.258845 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" containerName="nova-scheduler-scheduler" containerID="cri-o://a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3" gracePeriod=30 Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.276952 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 
13:11:53.277356 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-log" containerID="cri-o://5355b484c84eb930d16149103d41f17c6a59d839b97b3ce015fbf779f794595a" gracePeriod=30 Dec 16 13:11:53 crc kubenswrapper[4757]: I1216 13:11:53.277783 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-metadata" containerID="cri-o://690f26d2d51b88d356f69e48ac1a90ec32376097bd7392ec73c32166efe92b52" gracePeriod=30 Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.056448 4757 generic.go:334] "Generic (PLEG): container finished" podID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerID="ab24c6ca0319089d7a90e87725aa3faf795b768c3f535f3c5628e6e3b7002fca" exitCode=0 Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.056695 4757 generic.go:334] "Generic (PLEG): container finished" podID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerID="de39c9dae387c9a7a0a0337645a1f8fda98b3c76c812fc6e19ebb15643aca6d7" exitCode=143 Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.056497 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"577dca3f-5fa8-429a-a3fd-5913f0acd836","Type":"ContainerDied","Data":"ab24c6ca0319089d7a90e87725aa3faf795b768c3f535f3c5628e6e3b7002fca"} Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.056755 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"577dca3f-5fa8-429a-a3fd-5913f0acd836","Type":"ContainerDied","Data":"de39c9dae387c9a7a0a0337645a1f8fda98b3c76c812fc6e19ebb15643aca6d7"} Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.056767 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"577dca3f-5fa8-429a-a3fd-5913f0acd836","Type":"ContainerDied","Data":"8152bd5bd921449c848ac98981055688e6e4f227244588deffb1baceb77b1c11"} Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.056776 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8152bd5bd921449c848ac98981055688e6e4f227244588deffb1baceb77b1c11" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.060208 4757 generic.go:334] "Generic (PLEG): container finished" podID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerID="5355b484c84eb930d16149103d41f17c6a59d839b97b3ce015fbf779f794595a" exitCode=143 Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.061152 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34ac3c8a-f8cf-404e-93ed-8c422df4bebf","Type":"ContainerDied","Data":"5355b484c84eb930d16149103d41f17c6a59d839b97b3ce015fbf779f794595a"} Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.143535 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.296913 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577dca3f-5fa8-429a-a3fd-5913f0acd836-logs\") pod \"577dca3f-5fa8-429a-a3fd-5913f0acd836\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.297150 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvqmt\" (UniqueName: \"kubernetes.io/projected/577dca3f-5fa8-429a-a3fd-5913f0acd836-kube-api-access-xvqmt\") pod \"577dca3f-5fa8-429a-a3fd-5913f0acd836\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.297302 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-config-data\") pod \"577dca3f-5fa8-429a-a3fd-5913f0acd836\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.297325 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/577dca3f-5fa8-429a-a3fd-5913f0acd836-logs" (OuterVolumeSpecName: "logs") pod "577dca3f-5fa8-429a-a3fd-5913f0acd836" (UID: "577dca3f-5fa8-429a-a3fd-5913f0acd836"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.297369 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-public-tls-certs\") pod \"577dca3f-5fa8-429a-a3fd-5913f0acd836\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.297426 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-combined-ca-bundle\") pod \"577dca3f-5fa8-429a-a3fd-5913f0acd836\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.297518 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-internal-tls-certs\") pod \"577dca3f-5fa8-429a-a3fd-5913f0acd836\" (UID: \"577dca3f-5fa8-429a-a3fd-5913f0acd836\") " Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.298127 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577dca3f-5fa8-429a-a3fd-5913f0acd836-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.317943 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577dca3f-5fa8-429a-a3fd-5913f0acd836-kube-api-access-xvqmt" (OuterVolumeSpecName: "kube-api-access-xvqmt") pod "577dca3f-5fa8-429a-a3fd-5913f0acd836" (UID: "577dca3f-5fa8-429a-a3fd-5913f0acd836"). InnerVolumeSpecName "kube-api-access-xvqmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.338305 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "577dca3f-5fa8-429a-a3fd-5913f0acd836" (UID: "577dca3f-5fa8-429a-a3fd-5913f0acd836"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.358246 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-config-data" (OuterVolumeSpecName: "config-data") pod "577dca3f-5fa8-429a-a3fd-5913f0acd836" (UID: "577dca3f-5fa8-429a-a3fd-5913f0acd836"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.380267 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "577dca3f-5fa8-429a-a3fd-5913f0acd836" (UID: "577dca3f-5fa8-429a-a3fd-5913f0acd836"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.386096 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "577dca3f-5fa8-429a-a3fd-5913f0acd836" (UID: "577dca3f-5fa8-429a-a3fd-5913f0acd836"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.400682 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvqmt\" (UniqueName: \"kubernetes.io/projected/577dca3f-5fa8-429a-a3fd-5913f0acd836-kube-api-access-xvqmt\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.400712 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.400721 4757 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.400730 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:54 crc kubenswrapper[4757]: I1216 13:11:54.400739 4757 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577dca3f-5fa8-429a-a3fd-5913f0acd836-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.068821 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.091627 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.102141 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.127700 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.128193 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-log" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128219 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-log" Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.128256 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerName="dnsmasq-dns" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128265 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerName="dnsmasq-dns" Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.128281 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-api" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128289 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-api" Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.128309 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerName="init" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128316 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerName="init" Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.128327 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849962c6-8103-4d96-8136-23acb6221049" containerName="nova-manage" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128334 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="849962c6-8103-4d96-8136-23acb6221049" containerName="nova-manage" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128563 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="849962c6-8103-4d96-8136-23acb6221049" containerName="nova-manage" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128579 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-log" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128591 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" containerName="nova-api-api" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.128606 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="5361789a-bbaa-4f8f-a6c1-c1bda3e8cfff" containerName="dnsmasq-dns" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.129845 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.131894 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.132091 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.134821 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.138745 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.207437 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.213872 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.214972 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xsmd\" (UniqueName: \"kubernetes.io/projected/6aaa83ef-e285-41a7-93c0-853ecd275115-kube-api-access-9xsmd\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.215089 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aaa83ef-e285-41a7-93c0-853ecd275115-logs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.215152 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-config-data\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.215178 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-public-tls-certs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.215201 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.215345 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.217144 4757 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 13:11:55 crc kubenswrapper[4757]: E1216 13:11:55.217198 4757 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" containerName="nova-scheduler-scheduler" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.316745 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xsmd\" (UniqueName: \"kubernetes.io/projected/6aaa83ef-e285-41a7-93c0-853ecd275115-kube-api-access-9xsmd\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.316842 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aaa83ef-e285-41a7-93c0-853ecd275115-logs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.316912 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-config-data\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.316939 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-public-tls-certs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.316962 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.317056 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.318134 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aaa83ef-e285-41a7-93c0-853ecd275115-logs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.322053 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-public-tls-certs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.322523 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-config-data\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.323526 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.323706 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6aaa83ef-e285-41a7-93c0-853ecd275115-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.336040 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xsmd\" (UniqueName: \"kubernetes.io/projected/6aaa83ef-e285-41a7-93c0-853ecd275115-kube-api-access-9xsmd\") pod \"nova-api-0\" (UID: \"6aaa83ef-e285-41a7-93c0-853ecd275115\") " pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.447113 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 13:11:55 crc kubenswrapper[4757]: I1216 13:11:55.992453 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.084072 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6aaa83ef-e285-41a7-93c0-853ecd275115","Type":"ContainerStarted","Data":"e404fb6d608b20c83ca9a13ddbcbd555e811eecf96d1c5ea0d22a5bc3cf957fa"} Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.091813 4757 generic.go:334] "Generic (PLEG): container finished" podID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" containerID="a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3" exitCode=0 Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.091855 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"edc9f1f6-2c76-4a89-bcfc-24d45502027a","Type":"ContainerDied","Data":"a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3"} Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.285047 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.438924 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-config-data\") pod \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.439151 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n5m\" (UniqueName: \"kubernetes.io/projected/edc9f1f6-2c76-4a89-bcfc-24d45502027a-kube-api-access-s4n5m\") pod \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.439322 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-combined-ca-bundle\") pod \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\" (UID: \"edc9f1f6-2c76-4a89-bcfc-24d45502027a\") " Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.449521 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc9f1f6-2c76-4a89-bcfc-24d45502027a-kube-api-access-s4n5m" (OuterVolumeSpecName: "kube-api-access-s4n5m") pod "edc9f1f6-2c76-4a89-bcfc-24d45502027a" (UID: "edc9f1f6-2c76-4a89-bcfc-24d45502027a"). InnerVolumeSpecName "kube-api-access-s4n5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.483829 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-config-data" (OuterVolumeSpecName: "config-data") pod "edc9f1f6-2c76-4a89-bcfc-24d45502027a" (UID: "edc9f1f6-2c76-4a89-bcfc-24d45502027a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.484461 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edc9f1f6-2c76-4a89-bcfc-24d45502027a" (UID: "edc9f1f6-2c76-4a89-bcfc-24d45502027a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.542624 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n5m\" (UniqueName: \"kubernetes.io/projected/edc9f1f6-2c76-4a89-bcfc-24d45502027a-kube-api-access-s4n5m\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.542935 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.542947 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc9f1f6-2c76-4a89-bcfc-24d45502027a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.713691 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:42816->10.217.0.197:8775: read: connection reset by peer" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.713706 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:42804->10.217.0.197:8775: read: connection reset by peer" Dec 16 13:11:56 crc kubenswrapper[4757]: I1216 13:11:56.960508 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577dca3f-5fa8-429a-a3fd-5913f0acd836" path="/var/lib/kubelet/pods/577dca3f-5fa8-429a-a3fd-5913f0acd836/volumes" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.102229 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.102223 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"edc9f1f6-2c76-4a89-bcfc-24d45502027a","Type":"ContainerDied","Data":"39310a2e6c37b72fe30d7feca6d857303905aa41a11eb9f51d06c6bf4aea4049"} Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.102541 4757 scope.go:117] "RemoveContainer" containerID="a917624397914497299cbdcd6f524c88b345584c50c71a71ec06f07930b6a0d3" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.107852 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6aaa83ef-e285-41a7-93c0-853ecd275115","Type":"ContainerStarted","Data":"769d4d4063ebc38221f3e60c4399b22eed8a0f1c4c1739a2846da9bd80e5e929"} Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.107891 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6aaa83ef-e285-41a7-93c0-853ecd275115","Type":"ContainerStarted","Data":"19986d10d6449aa0ff306e94ccd25822124cbf2110af56b9fc83fb2f7c98f192"} Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.122772 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34ac3c8a-f8cf-404e-93ed-8c422df4bebf","Type":"ContainerDied","Data":"690f26d2d51b88d356f69e48ac1a90ec32376097bd7392ec73c32166efe92b52"} Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.122629 4757 generic.go:334] "Generic (PLEG): container finished" podID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerID="690f26d2d51b88d356f69e48ac1a90ec32376097bd7392ec73c32166efe92b52" exitCode=0 Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.148913 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.148870583 podStartE2EDuration="2.148870583s" podCreationTimestamp="2025-12-16 13:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:57.142112035 +0000 UTC m=+1502.569855831" watchObservedRunningTime="2025-12-16 13:11:57.148870583 +0000 UTC m=+1502.576614389" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.209134 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.235786 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.245902 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.274536 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:57 crc kubenswrapper[4757]: E1216 13:11:57.274987 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-log" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.275031 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-log" Dec 16 13:11:57 crc kubenswrapper[4757]: E1216 13:11:57.275079 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" containerName="nova-scheduler-scheduler" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.275088 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" containerName="nova-scheduler-scheduler" Dec 16 13:11:57 crc kubenswrapper[4757]: E1216 13:11:57.275109 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-metadata" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.275117 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-metadata" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.275319 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" containerName="nova-scheduler-scheduler" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.275339 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-metadata" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.275358 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" containerName="nova-metadata-log" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.282928 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.286636 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.313517 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.357657 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-config-data\") pod \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.357927 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-combined-ca-bundle\") pod \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358030 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-nova-metadata-tls-certs\") pod \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358162 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fm8s\" (UniqueName: \"kubernetes.io/projected/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-kube-api-access-4fm8s\") pod \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358270 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-logs\") pod \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\" (UID: \"34ac3c8a-f8cf-404e-93ed-8c422df4bebf\") " Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358668 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc85d441-d05f-4495-a380-1a5ed58ad631-config-data\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358748 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-logs" (OuterVolumeSpecName: "logs") pod "34ac3c8a-f8cf-404e-93ed-8c422df4bebf" (UID: "34ac3c8a-f8cf-404e-93ed-8c422df4bebf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358886 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867xv\" (UniqueName: \"kubernetes.io/projected/fc85d441-d05f-4495-a380-1a5ed58ad631-kube-api-access-867xv\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.358977 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc85d441-d05f-4495-a380-1a5ed58ad631-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.359103 4757 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-logs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.371212 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-kube-api-access-4fm8s" (OuterVolumeSpecName: "kube-api-access-4fm8s") pod "34ac3c8a-f8cf-404e-93ed-8c422df4bebf" (UID: "34ac3c8a-f8cf-404e-93ed-8c422df4bebf"). InnerVolumeSpecName "kube-api-access-4fm8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.421994 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34ac3c8a-f8cf-404e-93ed-8c422df4bebf" (UID: "34ac3c8a-f8cf-404e-93ed-8c422df4bebf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.426311 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-config-data" (OuterVolumeSpecName: "config-data") pod "34ac3c8a-f8cf-404e-93ed-8c422df4bebf" (UID: "34ac3c8a-f8cf-404e-93ed-8c422df4bebf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.461426 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-867xv\" (UniqueName: \"kubernetes.io/projected/fc85d441-d05f-4495-a380-1a5ed58ad631-kube-api-access-867xv\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.461492 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc85d441-d05f-4495-a380-1a5ed58ad631-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.461670 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc85d441-d05f-4495-a380-1a5ed58ad631-config-data\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.461793 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.461806 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.461822 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fm8s\" (UniqueName: \"kubernetes.io/projected/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-kube-api-access-4fm8s\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.469698 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc85d441-d05f-4495-a380-1a5ed58ad631-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.474857 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc85d441-d05f-4495-a380-1a5ed58ad631-config-data\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.500612 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-867xv\" (UniqueName: \"kubernetes.io/projected/fc85d441-d05f-4495-a380-1a5ed58ad631-kube-api-access-867xv\") pod \"nova-scheduler-0\" (UID: \"fc85d441-d05f-4495-a380-1a5ed58ad631\") " pod="openstack/nova-scheduler-0" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.531200 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "34ac3c8a-f8cf-404e-93ed-8c422df4bebf" (UID: "34ac3c8a-f8cf-404e-93ed-8c422df4bebf"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.563934 4757 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ac3c8a-f8cf-404e-93ed-8c422df4bebf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 13:11:57 crc kubenswrapper[4757]: I1216 13:11:57.602867 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.116202 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 13:11:58 crc kubenswrapper[4757]: W1216 13:11:58.117709 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc85d441_d05f_4495_a380_1a5ed58ad631.slice/crio-828a52c914634cf43684139ef1cfd25e17d33efe32d01686feaa123121c2b678 WatchSource:0}: Error finding container 828a52c914634cf43684139ef1cfd25e17d33efe32d01686feaa123121c2b678: Status 404 returned error can't find the container with id 828a52c914634cf43684139ef1cfd25e17d33efe32d01686feaa123121c2b678 Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.138305 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"34ac3c8a-f8cf-404e-93ed-8c422df4bebf","Type":"ContainerDied","Data":"60ab2d04e2feb9604c7c4db879aaeae2a4fbabaea63fa95b19c72985439b207b"} Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.138359 4757 scope.go:117] "RemoveContainer" containerID="690f26d2d51b88d356f69e48ac1a90ec32376097bd7392ec73c32166efe92b52" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.138382 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.140577 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc85d441-d05f-4495-a380-1a5ed58ad631","Type":"ContainerStarted","Data":"828a52c914634cf43684139ef1cfd25e17d33efe32d01686feaa123121c2b678"} Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.194984 4757 scope.go:117] "RemoveContainer" containerID="5355b484c84eb930d16149103d41f17c6a59d839b97b3ce015fbf779f794595a" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.205481 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.228254 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.260315 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.262279 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.264393 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.269305 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.273344 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.384076 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-config-data\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.384137 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5g5m\" (UniqueName: \"kubernetes.io/projected/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-kube-api-access-b5g5m\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.384159 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-logs\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.384179 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.384202 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.486394 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-config-data\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.486764 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5g5m\" (UniqueName: \"kubernetes.io/projected/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-kube-api-access-b5g5m\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.486814 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-logs\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 
13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.486840 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.486865 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.487513 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-logs\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.491992 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-config-data\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.493746 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.493856 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:58 crc kubenswrapper[4757]: I1216 13:11:58.504101 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5g5m\" (UniqueName: \"kubernetes.io/projected/0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877-kube-api-access-b5g5m\") pod \"nova-metadata-0\" (UID: \"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877\") " pod="openstack/nova-metadata-0" Dec 16 13:11:59 crc kubenswrapper[4757]: I1216 13:11:59.069536 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 13:11:59 crc kubenswrapper[4757]: I1216 13:11:59.092443 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ac3c8a-f8cf-404e-93ed-8c422df4bebf" path="/var/lib/kubelet/pods/34ac3c8a-f8cf-404e-93ed-8c422df4bebf/volumes" Dec 16 13:11:59 crc kubenswrapper[4757]: I1216 13:11:59.095294 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc9f1f6-2c76-4a89-bcfc-24d45502027a" path="/var/lib/kubelet/pods/edc9f1f6-2c76-4a89-bcfc-24d45502027a/volumes" Dec 16 13:11:59 crc kubenswrapper[4757]: I1216 13:11:59.234151 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc85d441-d05f-4495-a380-1a5ed58ad631","Type":"ContainerStarted","Data":"684f4700682bb292cd48dd33958e1e5496260886d4908f3ec9912b07f2d5f99f"} Dec 16 13:12:00 crc kubenswrapper[4757]: I1216 13:11:59.712758 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 13:12:00 crc kubenswrapper[4757]: I1216 13:12:00.244357 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877","Type":"ContainerStarted","Data":"68a13e9b6fc919f74dcd0970c1ef76a1ac87200ea409fc4a5f3e51010b6c60bf"} Dec 16 13:12:00 crc kubenswrapper[4757]: I1216 13:12:00.267274 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.267253759 podStartE2EDuration="3.267253759s" podCreationTimestamp="2025-12-16 13:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:12:00.258741788 +0000 UTC m=+1505.686485604" watchObservedRunningTime="2025-12-16 13:12:00.267253759 +0000 UTC m=+1505.694997555" Dec 16 13:12:01 crc kubenswrapper[4757]: I1216 13:12:01.256254 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877","Type":"ContainerStarted","Data":"c0d2e4a53f961df52b2995932307e9f0566f0cd0b6d43c58b8a8e8ad9ddf930c"} Dec 16 13:12:01 crc kubenswrapper[4757]: I1216 13:12:01.256568 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877","Type":"ContainerStarted","Data":"b3132571d1b00fb8709192571f4c30ab0641eceb2b59f1e0b5bfbe340bb69c1f"} Dec 16 13:12:01 crc kubenswrapper[4757]: I1216 13:12:01.283141 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.283118393 podStartE2EDuration="3.283118393s" podCreationTimestamp="2025-12-16 13:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:12:01.274741966 +0000 UTC m=+1506.702485782" watchObservedRunningTime="2025-12-16 13:12:01.283118393 +0000 UTC m=+1506.710862199" Dec 16 13:12:02 crc kubenswrapper[4757]: I1216 13:12:02.603606 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 13:12:04 crc kubenswrapper[4757]: I1216 13:12:04.071330 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 13:12:04 crc kubenswrapper[4757]: I1216 13:12:04.071749 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 13:12:05 crc 
kubenswrapper[4757]: I1216 13:12:05.448104 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 13:12:05 crc kubenswrapper[4757]: I1216 13:12:05.448461 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 13:12:05 crc kubenswrapper[4757]: I1216 13:12:05.949593 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:12:05 crc kubenswrapper[4757]: E1216 13:12:05.949826 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:12:06 crc kubenswrapper[4757]: I1216 13:12:06.462281 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6aaa83ef-e285-41a7-93c0-853ecd275115" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:12:06 crc kubenswrapper[4757]: I1216 13:12:06.462325 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6aaa83ef-e285-41a7-93c0-853ecd275115" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:12:07 crc kubenswrapper[4757]: I1216 13:12:07.604348 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 13:12:07 crc kubenswrapper[4757]: I1216 13:12:07.632433 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 13:12:08 crc kubenswrapper[4757]: I1216 13:12:08.371086 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 13:12:09 crc kubenswrapper[4757]: I1216 13:12:09.070700 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 13:12:09 crc kubenswrapper[4757]: I1216 13:12:09.070749 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 13:12:10 crc kubenswrapper[4757]: I1216 13:12:10.084142 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:12:10 crc kubenswrapper[4757]: I1216 13:12:10.084159 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 13:12:15 crc kubenswrapper[4757]: I1216 13:12:15.457748 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 13:12:15 crc kubenswrapper[4757]: I1216 13:12:15.459584 4757 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 13:12:15 crc kubenswrapper[4757]: I1216 13:12:15.460974 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 13:12:15 crc kubenswrapper[4757]: I1216 13:12:15.473905 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 13:12:16 crc kubenswrapper[4757]: I1216 13:12:16.419507 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 13:12:16 crc kubenswrapper[4757]: I1216 13:12:16.443473 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 13:12:16 crc kubenswrapper[4757]: I1216 13:12:16.573472 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 13:12:17 crc kubenswrapper[4757]: I1216 13:12:17.951590 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:12:17 crc kubenswrapper[4757]: E1216 13:12:17.951989 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:12:19 crc kubenswrapper[4757]: I1216 13:12:19.075465 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 13:12:19 crc kubenswrapper[4757]: I1216 13:12:19.080457 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 13:12:19 crc kubenswrapper[4757]: I1216 13:12:19.081871 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 13:12:19 crc kubenswrapper[4757]: I1216 13:12:19.447853 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 13:12:27 crc kubenswrapper[4757]: I1216 13:12:27.362341 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:12:28 crc kubenswrapper[4757]: I1216 13:12:28.030121 4757 scope.go:117] "RemoveContainer" containerID="f5fd308ddb7510ee0e4c3b4afb280537f0a556bd7462ab07c7b6cade8869500e" Dec 16 13:12:28 crc kubenswrapper[4757]: I1216 13:12:28.061566 4757 scope.go:117] "RemoveContainer" containerID="d00c3d0699f3db5dc28702bbad2957f404f37709d9dd615c72ec6502ad88e6e0" Dec 16 13:12:28 crc kubenswrapper[4757]: I1216 13:12:28.372048 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:12:30 crc kubenswrapper[4757]: I1216 13:12:30.948821 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:12:30 crc kubenswrapper[4757]: E1216 13:12:30.949393 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" 
podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:12:32 crc kubenswrapper[4757]: I1216 13:12:32.342848 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="rabbitmq" containerID="cri-o://390e2ade2cdd0e741440d82f6842104f35b406a950aaada35ce1e48c36b7c0e7" gracePeriod=604796 Dec 16 13:12:32 crc kubenswrapper[4757]: I1216 13:12:32.944672 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="38824624-9325-4515-ab97-157001f60385" containerName="rabbitmq" containerID="cri-o://0aa29fce5aa134e8e87689184651a5c33124e6e03eb9b1f6a548b2059a8fdd88" gracePeriod=604796 Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.626875 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-krb75"] Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.629663 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.640047 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krb75"] Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.765614 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-catalog-content\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.765651 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-utilities\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.765679 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tslw4\" (UniqueName: \"kubernetes.io/projected/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-kube-api-access-tslw4\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.867585 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-catalog-content\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.867647 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-utilities\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.867680 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tslw4\" (UniqueName: 
\"kubernetes.io/projected/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-kube-api-access-tslw4\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.868159 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-catalog-content\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.868381 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-utilities\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.889040 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tslw4\" (UniqueName: \"kubernetes.io/projected/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-kube-api-access-tslw4\") pod \"community-operators-krb75\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") " pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:33 crc kubenswrapper[4757]: I1216 13:12:33.979729 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krb75" Dec 16 13:12:34 crc kubenswrapper[4757]: I1216 13:12:34.476775 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krb75"] Dec 16 13:12:34 crc kubenswrapper[4757]: I1216 13:12:34.568631 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerStarted","Data":"2e820ad85211c54bb9e0b1c7764533b9dceafa07b2c68e53cf50bf9284e0cec3"} Dec 16 13:12:35 crc kubenswrapper[4757]: I1216 13:12:35.580697 4757 generic.go:334] "Generic (PLEG): container finished" podID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerID="0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5" exitCode=0 Dec 16 13:12:35 crc kubenswrapper[4757]: I1216 13:12:35.580746 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerDied","Data":"0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5"} Dec 16 13:12:36 crc kubenswrapper[4757]: I1216 13:12:36.591635 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerStarted","Data":"93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36"} Dec 16 13:12:37 crc kubenswrapper[4757]: I1216 13:12:37.278614 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 16 13:12:37 crc kubenswrapper[4757]: I1216 13:12:37.823358 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="38824624-9325-4515-ab97-157001f60385" containerName="rabbitmq" probeResult="failure" 
output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 16 13:12:38 crc kubenswrapper[4757]: I1216 13:12:38.611400 4757 generic.go:334] "Generic (PLEG): container finished" podID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerID="93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36" exitCode=0 Dec 16 13:12:38 crc kubenswrapper[4757]: I1216 13:12:38.611447 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerDied","Data":"93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36"} Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.640929 4757 generic.go:334] "Generic (PLEG): container finished" podID="38824624-9325-4515-ab97-157001f60385" containerID="0aa29fce5aa134e8e87689184651a5c33124e6e03eb9b1f6a548b2059a8fdd88" exitCode=0 Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.641634 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38824624-9325-4515-ab97-157001f60385","Type":"ContainerDied","Data":"0aa29fce5aa134e8e87689184651a5c33124e6e03eb9b1f6a548b2059a8fdd88"} Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.641667 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38824624-9325-4515-ab97-157001f60385","Type":"ContainerDied","Data":"88f607441a18fb63650f60bfb09f38014869e4c2c0c6187fc6d8dffa4de4aa91"} Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.641683 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f607441a18fb63650f60bfb09f38014869e4c2c0c6187fc6d8dffa4de4aa91" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.652370 4757 generic.go:334] "Generic (PLEG): container finished" podID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerID="390e2ade2cdd0e741440d82f6842104f35b406a950aaada35ce1e48c36b7c0e7" exitCode=0 Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.652878 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d0dd86b6-b617-44fa-aabc-f073e1df12ca","Type":"ContainerDied","Data":"390e2ade2cdd0e741440d82f6842104f35b406a950aaada35ce1e48c36b7c0e7"} Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.656828 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerStarted","Data":"fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d"} Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.690197 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-krb75" podStartSLOduration=3.266551431 podStartE2EDuration="6.690169471s" podCreationTimestamp="2025-12-16 13:12:33 +0000 UTC" firstStartedPulling="2025-12-16 13:12:35.582711114 +0000 UTC m=+1541.010454910" lastFinishedPulling="2025-12-16 13:12:39.006329154 +0000 UTC m=+1544.434072950" observedRunningTime="2025-12-16 13:12:39.68171304 +0000 UTC m=+1545.109456846" watchObservedRunningTime="2025-12-16 13:12:39.690169471 +0000 UTC m=+1545.117913267" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.718663 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.723295 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.805849 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-tls\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.805942 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-confd\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.805996 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-server-conf\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806075 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nhrm\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-kube-api-access-9nhrm\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806146 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-erlang-cookie\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806207 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0dd86b6-b617-44fa-aabc-f073e1df12ca-erlang-cookie-secret\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806229 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806298 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-plugins\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806320 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-confd\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806379 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-config-data\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: 
\"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806474 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-server-conf\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806517 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38824624-9325-4515-ab97-157001f60385-erlang-cookie-secret\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806540 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-plugins-conf\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806558 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806616 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-plugins-conf\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806652 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0dd86b6-b617-44fa-aabc-f073e1df12ca-pod-info\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806703 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-plugins\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806864 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhkdv\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-kube-api-access-vhkdv\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806916 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38824624-9325-4515-ab97-157001f60385-pod-info\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806939 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-erlang-cookie\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: 
\"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.806960 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-config-data\") pod \"38824624-9325-4515-ab97-157001f60385\" (UID: \"38824624-9325-4515-ab97-157001f60385\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.807015 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-tls\") pod \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\" (UID: \"d0dd86b6-b617-44fa-aabc-f073e1df12ca\") " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.810558 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.812187 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.813344 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.831604 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.849183 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.857134 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.861503 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38824624-9325-4515-ab97-157001f60385-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.861584 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.877518 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-kube-api-access-vhkdv" (OuterVolumeSpecName: "kube-api-access-vhkdv") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "kube-api-access-vhkdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.877934 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.885620 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.887116 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/38824624-9325-4515-ab97-157001f60385-pod-info" (OuterVolumeSpecName: "pod-info") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915029 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhkdv\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-kube-api-access-vhkdv\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915070 4757 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38824624-9325-4515-ab97-157001f60385-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915084 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915096 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915109 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915143 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915154 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915164 4757 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38824624-9325-4515-ab97-157001f60385-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915174 4757 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915193 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915202 4757 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.915210 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38824624-9325-4515-ab97-157001f60385-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.916280 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod 
"d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.916493 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-kube-api-access-9nhrm" (OuterVolumeSpecName: "kube-api-access-9nhrm") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "kube-api-access-9nhrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.920178 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d0dd86b6-b617-44fa-aabc-f073e1df12ca-pod-info" (OuterVolumeSpecName: "pod-info") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.957517 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.969383 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0dd86b6-b617-44fa-aabc-f073e1df12ca-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:12:39 crc kubenswrapper[4757]: I1216 13:12:39.993231 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-config-data" (OuterVolumeSpecName: "config-data") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.043158 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-server-conf" (OuterVolumeSpecName: "server-conf") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.060039 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066755 4757 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066800 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nhrm\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-kube-api-access-9nhrm\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066815 4757 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0dd86b6-b617-44fa-aabc-f073e1df12ca-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066830 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066841 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066854 4757 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0dd86b6-b617-44fa-aabc-f073e1df12ca-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066864 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38824624-9325-4515-ab97-157001f60385-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.066878 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.105377 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-config-data" (OuterVolumeSpecName: "config-data") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.107038 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-server-conf" (OuterVolumeSpecName: "server-conf") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.132699 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-rg9cd"] Dec 16 13:12:40 crc kubenswrapper[4757]: E1216 13:12:40.148272 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="rabbitmq" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.148322 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="rabbitmq" Dec 16 13:12:40 crc kubenswrapper[4757]: E1216 13:12:40.148355 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="setup-container" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.148364 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="setup-container" Dec 16 13:12:40 crc kubenswrapper[4757]: E1216 13:12:40.148376 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38824624-9325-4515-ab97-157001f60385" containerName="rabbitmq" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.148383 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="38824624-9325-4515-ab97-157001f60385" containerName="rabbitmq" Dec 16 13:12:40 crc kubenswrapper[4757]: E1216 13:12:40.148429 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38824624-9325-4515-ab97-157001f60385" containerName="setup-container" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.148437 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="38824624-9325-4515-ab97-157001f60385" containerName="setup-container" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.148665 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" containerName="rabbitmq" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.148693 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="38824624-9325-4515-ab97-157001f60385" containerName="rabbitmq" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.149893 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.159483 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.172913 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.172955 4757 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0dd86b6-b617-44fa-aabc-f073e1df12ca-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.215522 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-rg9cd"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275171 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275221 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-config\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275263 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275288 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275351 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275368 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.275410 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69tk\" (UniqueName: 
\"kubernetes.io/projected/12dbe683-e624-4ffc-b330-e93b3e62ae01-kube-api-access-q69tk\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.282339 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d0dd86b6-b617-44fa-aabc-f073e1df12ca" (UID: "d0dd86b6-b617-44fa-aabc-f073e1df12ca"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.319327 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "38824624-9325-4515-ab97-157001f60385" (UID: "38824624-9325-4515-ab97-157001f60385"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.377813 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.378384 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.378541 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.378666 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q69tk\" (UniqueName: \"kubernetes.io/projected/12dbe683-e624-4ffc-b330-e93b3e62ae01-kube-api-access-q69tk\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.378810 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.378941 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-config\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.379088 4757 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.379232 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0dd86b6-b617-44fa-aabc-f073e1df12ca-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.379306 4757 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38824624-9325-4515-ab97-157001f60385-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.379361 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.380471 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.380834 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-config\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.381157 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.381757 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.382108 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.413036 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69tk\" (UniqueName: \"kubernetes.io/projected/12dbe683-e624-4ffc-b330-e93b3e62ae01-kube-api-access-q69tk\") pod \"dnsmasq-dns-d558885bc-rg9cd\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") " pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.509862 4757 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.685563 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.685626 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.685671 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d0dd86b6-b617-44fa-aabc-f073e1df12ca","Type":"ContainerDied","Data":"53c816a9541180ed42b64598c79bff691f6773e5e6db0d1e0bac94f72d14763f"} Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.687508 4757 scope.go:117] "RemoveContainer" containerID="390e2ade2cdd0e741440d82f6842104f35b406a950aaada35ce1e48c36b7c0e7" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.735680 4757 scope.go:117] "RemoveContainer" containerID="8719fe889cba7bf8bb1f4102e84c7999b73788e4f5119eaa39f5b154a8014058" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.749821 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.773055 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.797972 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.829128 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.863318 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.873174 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.928254 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.928590 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.928632 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.929027 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.929052 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.929304 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-fmqf2" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.929473 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.938102 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.941356 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.947502 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.947927 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.948105 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.948495 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.948830 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.949028 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lkljn" Dec 16 13:12:40 crc kubenswrapper[4757]: I1216 13:12:40.951956 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.016134 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.016610 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpst7\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-kube-api-access-lpst7\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.016707 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017230 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017339 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017431 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017545 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/935a64f5-e332-4c06-b4df-f93ec46b7b35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017658 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017758 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017849 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.017931 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/935a64f5-e332-4c06-b4df-f93ec46b7b35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.032724 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38824624-9325-4515-ab97-157001f60385" path="/var/lib/kubelet/pods/38824624-9325-4515-ab97-157001f60385/volumes" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.033603 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0dd86b6-b617-44fa-aabc-f073e1df12ca" path="/var/lib/kubelet/pods/d0dd86b6-b617-44fa-aabc-f073e1df12ca/volumes" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.066979 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.092195 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.120758 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.120835 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.120871 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.120903 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.120939 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/935a64f5-e332-4c06-b4df-f93ec46b7b35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.120982 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121117 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 
13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121150 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/268a1573-c10e-42ca-9776-222ed2186693-pod-info\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121195 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121219 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwqs4\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-kube-api-access-jwqs4\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121252 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121300 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121340 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-server-conf\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121373 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/935a64f5-e332-4c06-b4df-f93ec46b7b35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121445 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121486 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-config-data\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121536 4757 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/268a1573-c10e-42ca-9776-222ed2186693-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121564 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpst7\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-kube-api-access-lpst7\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121595 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121627 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121662 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.121711 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.122219 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.126594 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.131536 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.135166 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.135666 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/935a64f5-e332-4c06-b4df-f93ec46b7b35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.135909 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.137293 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/935a64f5-e332-4c06-b4df-f93ec46b7b35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.147407 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.156577 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.161120 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpst7\" (UniqueName: \"kubernetes.io/projected/935a64f5-e332-4c06-b4df-f93ec46b7b35-kube-api-access-lpst7\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.164473 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/935a64f5-e332-4c06-b4df-f93ec46b7b35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.188924 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"935a64f5-e332-4c06-b4df-f93ec46b7b35\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: W1216 13:12:41.209681 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12dbe683_e624_4ffc_b330_e93b3e62ae01.slice/crio-155f0d0a8efcacc948f2e6175104edd0ef8c57158490c2677ad7d5470b70af02 WatchSource:0}: Error finding container 
155f0d0a8efcacc948f2e6175104edd0ef8c57158490c2677ad7d5470b70af02: Status 404 returned error can't find the container with id 155f0d0a8efcacc948f2e6175104edd0ef8c57158490c2677ad7d5470b70af02 Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.214721 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-rg9cd"] Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224133 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224213 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/268a1573-c10e-42ca-9776-222ed2186693-pod-info\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224243 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224271 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwqs4\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-kube-api-access-jwqs4\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224310 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-server-conf\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224372 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-config-data\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224405 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/268a1573-c10e-42ca-9776-222ed2186693-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224440 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224477 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224539 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.224579 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.229965 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.230291 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.235136 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.230315 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.238300 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-server-conf\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.239765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/268a1573-c10e-42ca-9776-222ed2186693-config-data\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.240620 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.243555 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/268a1573-c10e-42ca-9776-222ed2186693-pod-info\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.246668 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.246746 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/268a1573-c10e-42ca-9776-222ed2186693-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.248641 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwqs4\" (UniqueName: \"kubernetes.io/projected/268a1573-c10e-42ca-9776-222ed2186693-kube-api-access-jwqs4\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.275266 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.277354 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"268a1573-c10e-42ca-9776-222ed2186693\") " pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.284038 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.710525 4757 generic.go:334] "Generic (PLEG): container finished" podID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerID="5a2bc162bdced52e17c07d2ae4a27a1d5f615c01acdd964d9f20d38f647f60a1" exitCode=0 Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.711371 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" event={"ID":"12dbe683-e624-4ffc-b330-e93b3e62ae01","Type":"ContainerDied","Data":"5a2bc162bdced52e17c07d2ae4a27a1d5f615c01acdd964d9f20d38f647f60a1"} Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.711404 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" event={"ID":"12dbe683-e624-4ffc-b330-e93b3e62ae01","Type":"ContainerStarted","Data":"155f0d0a8efcacc948f2e6175104edd0ef8c57158490c2677ad7d5470b70af02"} Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.714241 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 13:12:41 crc kubenswrapper[4757]: I1216 13:12:41.846359 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 13:12:42 crc kubenswrapper[4757]: I1216 13:12:42.737145 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" event={"ID":"12dbe683-e624-4ffc-b330-e93b3e62ae01","Type":"ContainerStarted","Data":"c031e6afbadf7692292135df3d967cae1eb84f0513291e9743c2c7a49f9e5f68"} Dec 16 13:12:42 crc kubenswrapper[4757]: I1216 13:12:42.737582 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" Dec 16 13:12:42 crc kubenswrapper[4757]: I1216 13:12:42.741315 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"268a1573-c10e-42ca-9776-222ed2186693","Type":"ContainerStarted","Data":"e7a5a623c6e35f8a28ded0d5b38d030efcd0933889b9156a2813ece40df8c07b"} Dec 16 13:12:42 crc kubenswrapper[4757]: I1216 13:12:42.749059 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"935a64f5-e332-4c06-b4df-f93ec46b7b35","Type":"ContainerStarted","Data":"bd768fc6da0eba9cf497eb82643796d966e8791c59a8a0308462e1cbe4202354"} Dec 16 13:12:42 crc kubenswrapper[4757]: I1216 13:12:42.765643 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" podStartSLOduration=2.765619692 podStartE2EDuration="2.765619692s" podCreationTimestamp="2025-12-16 13:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:12:42.76389251 +0000 UTC m=+1548.191636306" watchObservedRunningTime="2025-12-16 13:12:42.765619692 +0000 UTC m=+1548.193363498" Dec 16 13:12:43 crc kubenswrapper[4757]: I1216 13:12:43.762894 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"268a1573-c10e-42ca-9776-222ed2186693","Type":"ContainerStarted","Data":"5e8e211e5e235d71167650c68c5e32e8447f1d854b9bb1facd80f467e15a99a9"} Dec 16 13:12:43 crc kubenswrapper[4757]: I1216 13:12:43.768491 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"935a64f5-e332-4c06-b4df-f93ec46b7b35","Type":"ContainerStarted","Data":"b4cb99cd98e5e5d62717180df337af3f8e64e2b6748e156142351172395b9fee"} Dec 16 13:12:43 
crc kubenswrapper[4757]: I1216 13:12:43.980213 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-krb75"
Dec 16 13:12:43 crc kubenswrapper[4757]: I1216 13:12:43.980270 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-krb75"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.024930 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-krb75" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="registry-server" probeResult="failure" output=<
Dec 16 13:12:45 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s
Dec 16 13:12:45 crc kubenswrapper[4757]: >
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.825335 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-grwb5"]
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.827358 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.832602 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-utilities\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.832666 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-catalog-content\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.832775 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnzsg\" (UniqueName: \"kubernetes.io/projected/a78507f1-bb33-4ab7-aa4f-339f87904aaa-kube-api-access-lnzsg\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.844942 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-grwb5"]
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.934694 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-utilities\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.935216 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-catalog-content\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.935320 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnzsg\" (UniqueName: \"kubernetes.io/projected/a78507f1-bb33-4ab7-aa4f-339f87904aaa-kube-api-access-lnzsg\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.935517 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-catalog-content\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.935597 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-utilities\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.950951 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"
Dec 16 13:12:45 crc kubenswrapper[4757]: E1216 13:12:45.951211 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:12:45 crc kubenswrapper[4757]: I1216 13:12:45.981064 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnzsg\" (UniqueName: \"kubernetes.io/projected/a78507f1-bb33-4ab7-aa4f-339f87904aaa-kube-api-access-lnzsg\") pod \"redhat-marketplace-grwb5\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") " pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:46 crc kubenswrapper[4757]: I1216 13:12:46.172041 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:46 crc kubenswrapper[4757]: I1216 13:12:46.634629 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-grwb5"]
Dec 16 13:12:46 crc kubenswrapper[4757]: I1216 13:12:46.793124 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerStarted","Data":"16ab58b93c134369761139459dbd39fec61d9bc203ad8d3f1cc24985d57ff6ae"}
Dec 16 13:12:47 crc kubenswrapper[4757]: I1216 13:12:47.806919 4757 generic.go:334] "Generic (PLEG): container finished" podID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerID="a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86" exitCode=0
Dec 16 13:12:47 crc kubenswrapper[4757]: I1216 13:12:47.807109 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerDied","Data":"a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86"}
Dec 16 13:12:49 crc kubenswrapper[4757]: I1216 13:12:49.826636 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerStarted","Data":"e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3"}
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.511270 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-rg9cd"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.581365 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"]
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.581680 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerName="dnsmasq-dns" containerID="cri-o://8a20fb86812f562f1ac35b77ebad8b3889603f3b2960c45a5c438321058abaae" gracePeriod=10
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.835982 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d47554775-kw5w5"]
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.840015 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.858085 4757 generic.go:334] "Generic (PLEG): container finished" podID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerID="8a20fb86812f562f1ac35b77ebad8b3889603f3b2960c45a5c438321058abaae" exitCode=0
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.858168 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" event={"ID":"b9fe3303-7f1d-4f67-8a51-8276430fb66b","Type":"ContainerDied","Data":"8a20fb86812f562f1ac35b77ebad8b3889603f3b2960c45a5c438321058abaae"}
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.871679 4757 generic.go:334] "Generic (PLEG): container finished" podID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerID="e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3" exitCode=0
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.871724 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerDied","Data":"e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3"}
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.875056 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d47554775-kw5w5"]
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.943897 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mrh\" (UniqueName: \"kubernetes.io/projected/dae9e574-826f-4521-8b35-5c836c1cde3b-kube-api-access-m7mrh\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.943946 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.944066 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.944094 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.944125 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-dns-svc\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.944212 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-config\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:50 crc kubenswrapper[4757]: I1216 13:12:50.944299 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048066 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mrh\" (UniqueName: \"kubernetes.io/projected/dae9e574-826f-4521-8b35-5c836c1cde3b-kube-api-access-m7mrh\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048120 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048188 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048208 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048227 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-dns-svc\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048256 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-config\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.048317 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.049210 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-ovsdbserver-sb\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.049361 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-dns-swift-storage-0\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.051023 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-dns-svc\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.051815 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-ovsdbserver-nb\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.055337 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.055360 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dae9e574-826f-4521-8b35-5c836c1cde3b-config\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.072571 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mrh\" (UniqueName: \"kubernetes.io/projected/dae9e574-826f-4521-8b35-5c836c1cde3b-kube-api-access-m7mrh\") pod \"dnsmasq-dns-6d47554775-kw5w5\" (UID: \"dae9e574-826f-4521-8b35-5c836c1cde3b\") " pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.166759 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.306618 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.460511 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-nb\") pod \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") "
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.460586 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-sb\") pod \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") "
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.460621 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-svc\") pod \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") "
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.460672 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x8xc\" (UniqueName: \"kubernetes.io/projected/b9fe3303-7f1d-4f67-8a51-8276430fb66b-kube-api-access-2x8xc\") pod \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") "
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.460780 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-config\") pod \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") "
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.460870 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-swift-storage-0\") pod \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\" (UID: \"b9fe3303-7f1d-4f67-8a51-8276430fb66b\") "
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.469196 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fe3303-7f1d-4f67-8a51-8276430fb66b-kube-api-access-2x8xc" (OuterVolumeSpecName: "kube-api-access-2x8xc") pod "b9fe3303-7f1d-4f67-8a51-8276430fb66b" (UID: "b9fe3303-7f1d-4f67-8a51-8276430fb66b"). InnerVolumeSpecName "kube-api-access-2x8xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.524209 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b9fe3303-7f1d-4f67-8a51-8276430fb66b" (UID: "b9fe3303-7f1d-4f67-8a51-8276430fb66b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.530041 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9fe3303-7f1d-4f67-8a51-8276430fb66b" (UID: "b9fe3303-7f1d-4f67-8a51-8276430fb66b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.531951 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-config" (OuterVolumeSpecName: "config") pod "b9fe3303-7f1d-4f67-8a51-8276430fb66b" (UID: "b9fe3303-7f1d-4f67-8a51-8276430fb66b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.538874 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b9fe3303-7f1d-4f67-8a51-8276430fb66b" (UID: "b9fe3303-7f1d-4f67-8a51-8276430fb66b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.544647 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9fe3303-7f1d-4f67-8a51-8276430fb66b" (UID: "b9fe3303-7f1d-4f67-8a51-8276430fb66b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.566228 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-config\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.566474 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.566532 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.566581 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.566629 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9fe3303-7f1d-4f67-8a51-8276430fb66b-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.566677 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x8xc\" (UniqueName: \"kubernetes.io/projected/b9fe3303-7f1d-4f67-8a51-8276430fb66b-kube-api-access-2x8xc\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.679615 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d47554775-kw5w5"]
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.886245 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8" event={"ID":"b9fe3303-7f1d-4f67-8a51-8276430fb66b","Type":"ContainerDied","Data":"6b8760bbb265a39e403260115ffd4ca19fbe4d17f570a0bdeebe160c0ad18f43"}
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.886579 4757 scope.go:117] "RemoveContainer" containerID="8a20fb86812f562f1ac35b77ebad8b3889603f3b2960c45a5c438321058abaae"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.886283 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.890708 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d47554775-kw5w5" event={"ID":"dae9e574-826f-4521-8b35-5c836c1cde3b","Type":"ContainerStarted","Data":"3d2eaaa7bed3eaa6215599ad7a6ede8c3f704614f6744b140f2e71b829a068ef"}
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.910134 4757 scope.go:117] "RemoveContainer" containerID="b580e4f56ee4aa29c2341d3e1f6841f23e5031ff5ae51b95b1dfa171266368f5"
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.954369 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"]
Dec 16 13:12:51 crc kubenswrapper[4757]: I1216 13:12:51.975963 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-5gkh8"]
Dec 16 13:12:52 crc kubenswrapper[4757]: I1216 13:12:52.902931 4757 generic.go:334] "Generic (PLEG): container finished" podID="dae9e574-826f-4521-8b35-5c836c1cde3b" containerID="e7932e996d8cbebbfb7384b28bc5cf0a3e9ca56d452477aadc2b9dd400b9dbc0" exitCode=0
Dec 16 13:12:52 crc kubenswrapper[4757]: I1216 13:12:52.903043 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d47554775-kw5w5" event={"ID":"dae9e574-826f-4521-8b35-5c836c1cde3b","Type":"ContainerDied","Data":"e7932e996d8cbebbfb7384b28bc5cf0a3e9ca56d452477aadc2b9dd400b9dbc0"}
Dec 16 13:12:52 crc kubenswrapper[4757]: I1216 13:12:52.923795 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerStarted","Data":"27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709"}
Dec 16 13:12:52 crc kubenswrapper[4757]: I1216 13:12:52.970704 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-grwb5" podStartSLOduration=3.742150186 podStartE2EDuration="7.970685543s" podCreationTimestamp="2025-12-16 13:12:45 +0000 UTC" firstStartedPulling="2025-12-16 13:12:47.809114244 +0000 UTC m=+1553.236858040" lastFinishedPulling="2025-12-16 13:12:52.037649601 +0000 UTC m=+1557.465393397" observedRunningTime="2025-12-16 13:12:52.961257779 +0000 UTC m=+1558.389001575" watchObservedRunningTime="2025-12-16 13:12:52.970685543 +0000 UTC m=+1558.398429339"
Dec 16 13:12:52 crc kubenswrapper[4757]: I1216 13:12:52.977986 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" path="/var/lib/kubelet/pods/b9fe3303-7f1d-4f67-8a51-8276430fb66b/volumes"
Dec 16 13:12:53 crc kubenswrapper[4757]: I1216 13:12:53.935162 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d47554775-kw5w5" event={"ID":"dae9e574-826f-4521-8b35-5c836c1cde3b","Type":"ContainerStarted","Data":"c63f9fd02e0ad111b952e49540a0e7485831ac60f7483671fbbf77bcccbe027a"}
Dec 16 13:12:53 crc kubenswrapper[4757]: I1216 13:12:53.935577 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:12:53 crc kubenswrapper[4757]: I1216 13:12:53.965158 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d47554775-kw5w5" podStartSLOduration=3.965138976 podStartE2EDuration="3.965138976s" podCreationTimestamp="2025-12-16 13:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:12:53.954673266 +0000 UTC m=+1559.382417092" watchObservedRunningTime="2025-12-16 13:12:53.965138976 +0000 UTC m=+1559.392882772"
Dec 16 13:12:54 crc kubenswrapper[4757]: I1216 13:12:54.034020 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-krb75"
Dec 16 13:12:54 crc kubenswrapper[4757]: I1216 13:12:54.103434 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-krb75"
Dec 16 13:12:54 crc kubenswrapper[4757]: I1216 13:12:54.273955 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krb75"]
Dec 16 13:12:55 crc kubenswrapper[4757]: I1216 13:12:55.976144 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-krb75" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="registry-server" containerID="cri-o://fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d" gracePeriod=2
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.173533 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.173894 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.232920 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.550797 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krb75"
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.683357 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-catalog-content\") pod \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") "
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.683451 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tslw4\" (UniqueName: \"kubernetes.io/projected/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-kube-api-access-tslw4\") pod \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") "
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.683622 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-utilities\") pod \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\" (UID: \"c200109e-9fd1-4a69-ae82-5a0e29c09ccf\") "
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.684972 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-utilities" (OuterVolumeSpecName: "utilities") pod "c200109e-9fd1-4a69-ae82-5a0e29c09ccf" (UID: "c200109e-9fd1-4a69-ae82-5a0e29c09ccf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.689282 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-kube-api-access-tslw4" (OuterVolumeSpecName: "kube-api-access-tslw4") pod "c200109e-9fd1-4a69-ae82-5a0e29c09ccf" (UID: "c200109e-9fd1-4a69-ae82-5a0e29c09ccf"). InnerVolumeSpecName "kube-api-access-tslw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.733785 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c200109e-9fd1-4a69-ae82-5a0e29c09ccf" (UID: "c200109e-9fd1-4a69-ae82-5a0e29c09ccf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.786479 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.786522 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.786535 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tslw4\" (UniqueName: \"kubernetes.io/projected/c200109e-9fd1-4a69-ae82-5a0e29c09ccf-kube-api-access-tslw4\") on node \"crc\" DevicePath \"\""
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.988408 4757 generic.go:334] "Generic (PLEG): container finished" podID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerID="fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d" exitCode=0
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.989398 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krb75"
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.990232 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerDied","Data":"fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d"}
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.990273 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krb75" event={"ID":"c200109e-9fd1-4a69-ae82-5a0e29c09ccf","Type":"ContainerDied","Data":"2e820ad85211c54bb9e0b1c7764533b9dceafa07b2c68e53cf50bf9284e0cec3"}
Dec 16 13:12:56 crc kubenswrapper[4757]: I1216 13:12:56.990315 4757 scope.go:117] "RemoveContainer" containerID="fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.023742 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krb75"]
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.028974 4757 scope.go:117] "RemoveContainer" containerID="93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.033636 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-krb75"]
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.050246 4757 scope.go:117] "RemoveContainer" containerID="0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.087961 4757 scope.go:117] "RemoveContainer" containerID="fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d"
Dec 16 13:12:57 crc kubenswrapper[4757]: E1216 13:12:57.088537 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d\": container with ID starting with fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d not found: ID does not exist" containerID="fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.088567 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d"} err="failed to get container status \"fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d\": rpc error: code = NotFound desc = could not find container \"fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d\": container with ID starting with fd458c717d095f5cb61196b254e1284532a299fc7cca1109492c089ae4da3f0d not found: ID does not exist"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.088586 4757 scope.go:117] "RemoveContainer" containerID="93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36"
Dec 16 13:12:57 crc kubenswrapper[4757]: E1216 13:12:57.088912 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36\": container with ID starting with 93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36 not found: ID does not exist" containerID="93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.088954 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36"} err="failed to get container status \"93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36\": rpc error: code = NotFound desc = could not find container \"93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36\": container with ID starting with 93840a5aac7e792f7083a4d5276a9603e8b9679efe3882721ab4219f4d791d36 not found: ID does not exist"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.088981 4757 scope.go:117] "RemoveContainer" containerID="0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5"
Dec 16 13:12:57 crc kubenswrapper[4757]: E1216 13:12:57.089309 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5\": container with ID starting with 0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5 not found: ID does not exist" containerID="0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5"
Dec 16 13:12:57 crc kubenswrapper[4757]: I1216 13:12:57.089335 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5"} err="failed to get container status \"0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5\": rpc error: code = NotFound desc = could not find container \"0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5\": container with ID starting with 0b25a6b40e14a17aa1b714473a16c43ea5f6583b08f12af52e79b829ad3b58a5 not found: ID does not exist"
Dec 16 13:12:58 crc kubenswrapper[4757]: I1216 13:12:58.960249 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" path="/var/lib/kubelet/pods/c200109e-9fd1-4a69-ae82-5a0e29c09ccf/volumes"
Dec 16 13:12:59 crc kubenswrapper[4757]: I1216 13:12:59.948868 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"
Dec 16 13:12:59 crc kubenswrapper[4757]: E1216 13:12:59.949221 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:13:01 crc kubenswrapper[4757]: I1216 13:13:01.168717 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d47554775-kw5w5"
Dec 16 13:13:01 crc kubenswrapper[4757]: I1216 13:13:01.245191 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-rg9cd"]
Dec 16 13:13:01 crc kubenswrapper[4757]: I1216 13:13:01.245451 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerName="dnsmasq-dns" containerID="cri-o://c031e6afbadf7692292135df3d967cae1eb84f0513291e9743c2c7a49f9e5f68" gracePeriod=10
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.045744 4757 generic.go:334] "Generic (PLEG): container finished" podID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerID="c031e6afbadf7692292135df3d967cae1eb84f0513291e9743c2c7a49f9e5f68" exitCode=0
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.045909 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" event={"ID":"12dbe683-e624-4ffc-b330-e93b3e62ae01","Type":"ContainerDied","Data":"c031e6afbadf7692292135df3d967cae1eb84f0513291e9743c2c7a49f9e5f68"}
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.259688 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-rg9cd"
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.331774 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-config\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.331830 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-sb\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.331903 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-swift-storage-0\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.331951 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69tk\" (UniqueName: \"kubernetes.io/projected/12dbe683-e624-4ffc-b330-e93b3e62ae01-kube-api-access-q69tk\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.331994 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.332052 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-nb\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.332093 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-openstack-edpm-ipam\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.358633 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12dbe683-e624-4ffc-b330-e93b3e62ae01-kube-api-access-q69tk" (OuterVolumeSpecName: "kube-api-access-q69tk") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "kube-api-access-q69tk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.391031 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.425894 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.426249 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-config" (OuterVolumeSpecName: "config") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.433552 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.434853 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.434876 4757 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-config\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.434885 4757 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.434919 4757 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.434928 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q69tk\" (UniqueName: \"kubernetes.io/projected/12dbe683-e624-4ffc-b330-e93b3e62ae01-kube-api-access-q69tk\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:02 crc kubenswrapper[4757]: E1216 13:13:02.441322 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc podName:12dbe683-e624-4ffc-b330-e93b3e62ae01 nodeName:}" failed. No retries permitted until 2025-12-16 13:13:02.941290031 +0000 UTC m=+1568.369033887 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01") : error deleting /var/lib/kubelet/pods/12dbe683-e624-4ffc-b330-e93b3e62ae01/volume-subpaths: remove /var/lib/kubelet/pods/12dbe683-e624-4ffc-b330-e93b3e62ae01/volume-subpaths: no such file or directory
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.441769 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.536317 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.950111 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc\") pod \"12dbe683-e624-4ffc-b330-e93b3e62ae01\" (UID: \"12dbe683-e624-4ffc-b330-e93b3e62ae01\") "
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.950742 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12dbe683-e624-4ffc-b330-e93b3e62ae01" (UID: "12dbe683-e624-4ffc-b330-e93b3e62ae01"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:13:02 crc kubenswrapper[4757]: I1216 13:13:02.953745 4757 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12dbe683-e624-4ffc-b330-e93b3e62ae01-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:03 crc kubenswrapper[4757]: I1216 13:13:03.056178 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-rg9cd" event={"ID":"12dbe683-e624-4ffc-b330-e93b3e62ae01","Type":"ContainerDied","Data":"155f0d0a8efcacc948f2e6175104edd0ef8c57158490c2677ad7d5470b70af02"}
Dec 16 13:13:03 crc kubenswrapper[4757]: I1216 13:13:03.056235 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-rg9cd"
Dec 16 13:13:03 crc kubenswrapper[4757]: I1216 13:13:03.056238 4757 scope.go:117] "RemoveContainer" containerID="c031e6afbadf7692292135df3d967cae1eb84f0513291e9743c2c7a49f9e5f68"
Dec 16 13:13:03 crc kubenswrapper[4757]: I1216 13:13:03.080301 4757 scope.go:117] "RemoveContainer" containerID="5a2bc162bdced52e17c07d2ae4a27a1d5f615c01acdd964d9f20d38f647f60a1"
Dec 16 13:13:03 crc kubenswrapper[4757]: I1216 13:13:03.085235 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-rg9cd"]
Dec 16 13:13:03 crc kubenswrapper[4757]: I1216 13:13:03.095001 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-rg9cd"]
Dec 16 13:13:04 crc kubenswrapper[4757]: I1216 13:13:04.960872 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" path="/var/lib/kubelet/pods/12dbe683-e624-4ffc-b330-e93b3e62ae01/volumes"
Dec 16 13:13:06 crc kubenswrapper[4757]: I1216 13:13:06.229793 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:13:06 crc kubenswrapper[4757]: I1216 13:13:06.293469 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-grwb5"]
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.090565 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-grwb5" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="registry-server" containerID="cri-o://27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709" gracePeriod=2
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.661454 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.752685 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-catalog-content\") pod \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") "
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.752920 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-utilities\") pod \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") "
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.753048 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnzsg\" (UniqueName: \"kubernetes.io/projected/a78507f1-bb33-4ab7-aa4f-339f87904aaa-kube-api-access-lnzsg\") pod \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\" (UID: \"a78507f1-bb33-4ab7-aa4f-339f87904aaa\") "
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.756979 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-utilities" (OuterVolumeSpecName: "utilities") pod "a78507f1-bb33-4ab7-aa4f-339f87904aaa" (UID: "a78507f1-bb33-4ab7-aa4f-339f87904aaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.759317 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78507f1-bb33-4ab7-aa4f-339f87904aaa-kube-api-access-lnzsg" (OuterVolumeSpecName: "kube-api-access-lnzsg") pod "a78507f1-bb33-4ab7-aa4f-339f87904aaa" (UID: "a78507f1-bb33-4ab7-aa4f-339f87904aaa"). InnerVolumeSpecName "kube-api-access-lnzsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.795127 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a78507f1-bb33-4ab7-aa4f-339f87904aaa" (UID: "a78507f1-bb33-4ab7-aa4f-339f87904aaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.855277 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.855319 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnzsg\" (UniqueName: \"kubernetes.io/projected/a78507f1-bb33-4ab7-aa4f-339f87904aaa-kube-api-access-lnzsg\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:07 crc kubenswrapper[4757]: I1216 13:13:07.855330 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78507f1-bb33-4ab7-aa4f-339f87904aaa-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.101696 4757 generic.go:334] "Generic (PLEG): container finished" podID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerID="27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709" exitCode=0
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.101802 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerDied","Data":"27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709"}
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.101825 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grwb5"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.102087 4757 scope.go:117] "RemoveContainer" containerID="27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.102070 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grwb5" event={"ID":"a78507f1-bb33-4ab7-aa4f-339f87904aaa","Type":"ContainerDied","Data":"16ab58b93c134369761139459dbd39fec61d9bc203ad8d3f1cc24985d57ff6ae"}
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.128156 4757 scope.go:117] "RemoveContainer" containerID="e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.147426 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-grwb5"]
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.158924 4757 scope.go:117] "RemoveContainer" containerID="a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.169484 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-grwb5"]
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.214344 4757 scope.go:117] "RemoveContainer" containerID="27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709"
Dec 16 13:13:08 crc kubenswrapper[4757]: E1216 13:13:08.214893 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709\": container with ID starting with 27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709 not found: ID does not exist" containerID="27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.214927 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709"} err="failed to get container status \"27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709\": rpc error: code = NotFound desc = could not find container \"27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709\": container with ID starting with 27099d7c8a23b790ad2ff20ca966fdd11c7343e02efd5a83c5f2a91cbdde8709 not found: ID does not exist"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.215155 4757 scope.go:117] "RemoveContainer" containerID="e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3"
Dec 16 13:13:08 crc kubenswrapper[4757]: E1216 13:13:08.215560 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3\": container with ID starting with e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3 not found: ID does not exist" containerID="e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.215581 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3"} err="failed to get container status \"e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3\": rpc error: code = NotFound desc = could not find container \"e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3\": container with ID starting with e47ddca19db00d43459e32b683c21112e02a5b46771bede2673ceb5a3b17f7a3 not found: ID does not exist"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.215594 4757 scope.go:117] "RemoveContainer" containerID="a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86"
Dec 16 13:13:08 crc kubenswrapper[4757]: E1216 13:13:08.216072 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86\": container with ID starting with a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86 not found: ID does not exist" containerID="a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.216102 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86"} err="failed to get container status \"a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86\": rpc error: code = NotFound desc = could not find container \"a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86\": container with ID starting with a2156cc154f2cf2bd3a92e9e20e75c4e1de18d2e9fe13d2fc65f6abe788f8e86 not found: ID does not exist"
Dec 16 13:13:08 crc kubenswrapper[4757]: I1216 13:13:08.958888 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" path="/var/lib/kubelet/pods/a78507f1-bb33-4ab7-aa4f-339f87904aaa/volumes"
Dec 16 13:13:11 crc kubenswrapper[4757]: I1216 13:13:11.950060 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"
Dec 16 13:13:11 crc kubenswrapper[4757]: E1216 13:13:11.950697 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:13:16 crc kubenswrapper[4757]: I1216 13:13:16.177481 4757 generic.go:334] "Generic (PLEG): container finished" podID="268a1573-c10e-42ca-9776-222ed2186693" containerID="5e8e211e5e235d71167650c68c5e32e8447f1d854b9bb1facd80f467e15a99a9" exitCode=0
Dec 16 13:13:16 crc kubenswrapper[4757]: I1216 13:13:16.177542 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"268a1573-c10e-42ca-9776-222ed2186693","Type":"ContainerDied","Data":"5e8e211e5e235d71167650c68c5e32e8447f1d854b9bb1facd80f467e15a99a9"}
Dec 16 13:13:16 crc kubenswrapper[4757]: I1216 13:13:16.181992 4757 generic.go:334] "Generic (PLEG): container finished" podID="935a64f5-e332-4c06-b4df-f93ec46b7b35" containerID="b4cb99cd98e5e5d62717180df337af3f8e64e2b6748e156142351172395b9fee" exitCode=0
Dec 16 13:13:16 crc kubenswrapper[4757]: I1216 13:13:16.182066 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"935a64f5-e332-4c06-b4df-f93ec46b7b35","Type":"ContainerDied","Data":"b4cb99cd98e5e5d62717180df337af3f8e64e2b6748e156142351172395b9fee"}
Dec 16 13:13:17 crc kubenswrapper[4757]: I1216 13:13:17.192819 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"268a1573-c10e-42ca-9776-222ed2186693","Type":"ContainerStarted","Data":"623a5b497ae35053d7a5b7496d107bd24dce708c86122eb0c378c10d6c7f67e0"}
Dec 16 13:13:17 crc kubenswrapper[4757]: I1216 13:13:17.194926 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 16 13:13:17 crc kubenswrapper[4757]: I1216 13:13:17.195073 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"935a64f5-e332-4c06-b4df-f93ec46b7b35","Type":"ContainerStarted","Data":"c937e685f6dbe880b45c2e24fd11a358f5814cbb869a45f7b4f25d26d15a2118"}
Dec 16 13:13:17 crc kubenswrapper[4757]: I1216 13:13:17.195594 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 16 13:13:17 crc kubenswrapper[4757]: I1216 13:13:17.217198 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.21718049 podStartE2EDuration="37.21718049s" podCreationTimestamp="2025-12-16 13:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:13:17.214645717 +0000 UTC m=+1582.642389513" watchObservedRunningTime="2025-12-16 13:13:17.21718049 +0000 UTC m=+1582.644924286"
Dec 16 13:13:17 crc kubenswrapper[4757]: I1216 13:13:17.250854 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.250832064 podStartE2EDuration="37.250832064s" podCreationTimestamp="2025-12-16 13:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:13:17.241990615 +0000 UTC m=+1582.669734411" watchObservedRunningTime="2025-12-16 13:13:17.250832064 +0000 UTC m=+1582.678575860"
Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.769676 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555"]
Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770566 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="extract-utilities"
Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770585 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="extract-utilities"
Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770602 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="registry-server"
Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770610 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="registry-server"
Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770631 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerName="dnsmasq-dns"
Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770639 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerName="dnsmasq-dns"
Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770663 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf"
containerName="extract-utilities" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770672 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="extract-utilities" Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770683 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="extract-content" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770690 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="extract-content" Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770703 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerName="init" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770710 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerName="init" Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770739 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerName="init" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770747 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerName="init" Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770757 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerName="dnsmasq-dns" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770764 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerName="dnsmasq-dns" Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770781 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="extract-content" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770788 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="extract-content" Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.770807 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="registry-server" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.770814 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="registry-server" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.771085 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9fe3303-7f1d-4f67-8a51-8276430fb66b" containerName="dnsmasq-dns" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.771101 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c200109e-9fd1-4a69-ae82-5a0e29c09ccf" containerName="registry-server" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.771109 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78507f1-bb33-4ab7-aa4f-339f87904aaa" containerName="registry-server" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.771126 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="12dbe683-e624-4ffc-b330-e93b3e62ae01" containerName="dnsmasq-dns" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.771912 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: W1216 13:13:20.781419 4757 reflector.go:561] object-"openstack"/"dataplanenodeset-openstack-edpm-ipam": failed to list *v1.Secret: secrets "dataplanenodeset-openstack-edpm-ipam" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.781463 4757 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"dataplanenodeset-openstack-edpm-ipam\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 13:13:20 crc kubenswrapper[4757]: W1216 13:13:20.781526 4757 reflector.go:561] object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf": failed to list *v1.Secret: secrets "openstack-edpm-ipam-dockercfg-58lpf" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.781537 4757 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-58lpf\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-edpm-ipam-dockercfg-58lpf\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 13:13:20 crc kubenswrapper[4757]: W1216 13:13:20.781608 4757 reflector.go:561] object-"openstack"/"dataplane-ansible-ssh-private-key-secret": failed to list *v1.Secret: secrets "dataplane-ansible-ssh-private-key-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.781619 4757 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"dataplane-ansible-ssh-private-key-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 13:13:20 crc kubenswrapper[4757]: W1216 13:13:20.781655 4757 reflector.go:561] object-"openstack"/"openstack-aee-default-env": failed to list *v1.ConfigMap: configmaps "openstack-aee-default-env" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 13:13:20 crc kubenswrapper[4757]: E1216 13:13:20.781664 4757 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-aee-default-env\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-aee-default-env\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.793662 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555"] Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.827282 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.829880 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.829983 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqv4f\" (UniqueName: \"kubernetes.io/projected/2c6850c9-0076-4df2-92e7-14521aa14305-kube-api-access-wqv4f\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.830064 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.932399 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.932503 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.932553 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqv4f\" (UniqueName: \"kubernetes.io/projected/2c6850c9-0076-4df2-92e7-14521aa14305-kube-api-access-wqv4f\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.932587 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.944846 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:20 crc kubenswrapper[4757]: I1216 13:13:20.963430 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqv4f\" (UniqueName: \"kubernetes.io/projected/2c6850c9-0076-4df2-92e7-14521aa14305-kube-api-access-wqv4f\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:21 crc kubenswrapper[4757]: I1216 13:13:21.780120 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:13:21 crc kubenswrapper[4757]: I1216 13:13:21.789360 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:21 crc kubenswrapper[4757]: I1216 13:13:21.809142 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:13:21 crc kubenswrapper[4757]: E1216 13:13:21.933208 4757 secret.go:188] Couldn't get secret openstack/dataplane-ansible-ssh-private-key-secret: failed to sync secret cache: timed out waiting for the condition Dec 16 13:13:21 crc kubenswrapper[4757]: E1216 13:13:21.933294 4757 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key podName:2c6850c9-0076-4df2-92e7-14521aa14305 nodeName:}" failed. No retries permitted until 2025-12-16 13:13:22.433274549 +0000 UTC m=+1587.861018345 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ssh-key" (UniqueName: "kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key") pod "repo-setup-edpm-deployment-openstack-edpm-ipam-77555" (UID: "2c6850c9-0076-4df2-92e7-14521aa14305") : failed to sync secret cache: timed out waiting for the condition Dec 16 13:13:21 crc kubenswrapper[4757]: I1216 13:13:21.976686 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:13:22 crc kubenswrapper[4757]: I1216 13:13:22.289285 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:13:22 crc kubenswrapper[4757]: I1216 13:13:22.460951 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:22 crc kubenswrapper[4757]: I1216 13:13:22.464715 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-77555\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:22 crc kubenswrapper[4757]: I1216 13:13:22.593987 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:23 crc kubenswrapper[4757]: I1216 13:13:23.621892 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555"] Dec 16 13:13:24 crc kubenswrapper[4757]: I1216 13:13:24.255114 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" event={"ID":"2c6850c9-0076-4df2-92e7-14521aa14305","Type":"ContainerStarted","Data":"d5e6e3124fa75c5880cf20226772dd16afd016e228f900f5f930cc2df52748e1"} Dec 16 13:13:26 crc kubenswrapper[4757]: I1216 13:13:26.959275 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:13:26 crc kubenswrapper[4757]: E1216 13:13:26.960543 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:13:28 crc kubenswrapper[4757]: I1216 13:13:28.303948 4757 scope.go:117] "RemoveContainer" containerID="fef3b9ba90c33271b530e822a9cf915b454fffe64618a682d8d11e0a8e188cd3" Dec 16 13:13:28 crc kubenswrapper[4757]: I1216 13:13:28.340659 4757 scope.go:117] "RemoveContainer" containerID="0aa29fce5aa134e8e87689184651a5c33124e6e03eb9b1f6a548b2059a8fdd88" Dec 16 13:13:28 crc kubenswrapper[4757]: I1216 13:13:28.382240 4757 scope.go:117] "RemoveContainer" containerID="0e85736b68697cab324c82628435878c14a0dc9fd56231019ad728633de68d34" Dec 16 13:13:31 crc kubenswrapper[4757]: I1216 13:13:31.282054 4757 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 13:13:31 crc kubenswrapper[4757]: I1216 13:13:31.289083 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 13:13:37 crc kubenswrapper[4757]: I1216 13:13:37.391872 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" event={"ID":"2c6850c9-0076-4df2-92e7-14521aa14305","Type":"ContainerStarted","Data":"18ba1fcca682223c2a32b19678978283fd4ab248399f92c2aa28459a360d2a88"} Dec 16 13:13:37 crc kubenswrapper[4757]: I1216 13:13:37.417211 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" podStartSLOduration=4.346737027 podStartE2EDuration="17.417183563s" podCreationTimestamp="2025-12-16 13:13:20 +0000 UTC" firstStartedPulling="2025-12-16 13:13:23.6285507 +0000 UTC m=+1589.056294496" lastFinishedPulling="2025-12-16 13:13:36.698997236 +0000 UTC m=+1602.126741032" observedRunningTime="2025-12-16 13:13:37.407160735 +0000 UTC m=+1602.834904521" watchObservedRunningTime="2025-12-16 13:13:37.417183563 +0000 UTC m=+1602.844927359" Dec 16 13:13:38 crc kubenswrapper[4757]: I1216 13:13:38.949660 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:13:38 crc kubenswrapper[4757]: E1216 13:13:38.949923 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:13:51 crc kubenswrapper[4757]: I1216 13:13:51.525859 4757 generic.go:334] "Generic (PLEG): container finished" podID="2c6850c9-0076-4df2-92e7-14521aa14305" containerID="18ba1fcca682223c2a32b19678978283fd4ab248399f92c2aa28459a360d2a88" exitCode=0 Dec 16 13:13:51 crc kubenswrapper[4757]: I1216 13:13:51.525963 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" event={"ID":"2c6850c9-0076-4df2-92e7-14521aa14305","Type":"ContainerDied","Data":"18ba1fcca682223c2a32b19678978283fd4ab248399f92c2aa28459a360d2a88"} Dec 16 13:13:52 crc kubenswrapper[4757]: I1216 13:13:52.974603 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.077664 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-inventory\") pod \"2c6850c9-0076-4df2-92e7-14521aa14305\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.077744 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-repo-setup-combined-ca-bundle\") pod \"2c6850c9-0076-4df2-92e7-14521aa14305\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.077780 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqv4f\" (UniqueName: \"kubernetes.io/projected/2c6850c9-0076-4df2-92e7-14521aa14305-kube-api-access-wqv4f\") pod \"2c6850c9-0076-4df2-92e7-14521aa14305\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.077962 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key\") pod \"2c6850c9-0076-4df2-92e7-14521aa14305\" (UID: \"2c6850c9-0076-4df2-92e7-14521aa14305\") " Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.083313 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2c6850c9-0076-4df2-92e7-14521aa14305" (UID: "2c6850c9-0076-4df2-92e7-14521aa14305"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.084139 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6850c9-0076-4df2-92e7-14521aa14305-kube-api-access-wqv4f" (OuterVolumeSpecName: "kube-api-access-wqv4f") pod "2c6850c9-0076-4df2-92e7-14521aa14305" (UID: "2c6850c9-0076-4df2-92e7-14521aa14305"). InnerVolumeSpecName "kube-api-access-wqv4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.103631 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-inventory" (OuterVolumeSpecName: "inventory") pod "2c6850c9-0076-4df2-92e7-14521aa14305" (UID: "2c6850c9-0076-4df2-92e7-14521aa14305"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.120401 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2c6850c9-0076-4df2-92e7-14521aa14305" (UID: "2c6850c9-0076-4df2-92e7-14521aa14305"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.180637 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.180677 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.180694 4757 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6850c9-0076-4df2-92e7-14521aa14305-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.180710 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqv4f\" (UniqueName: \"kubernetes.io/projected/2c6850c9-0076-4df2-92e7-14521aa14305-kube-api-access-wqv4f\") on node \"crc\" DevicePath \"\"" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.547512 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" event={"ID":"2c6850c9-0076-4df2-92e7-14521aa14305","Type":"ContainerDied","Data":"d5e6e3124fa75c5880cf20226772dd16afd016e228f900f5f930cc2df52748e1"} Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.547565 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5e6e3124fa75c5880cf20226772dd16afd016e228f900f5f930cc2df52748e1" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.547569 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-77555" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.646288 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p"] Dec 16 13:13:53 crc kubenswrapper[4757]: E1216 13:13:53.646722 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6850c9-0076-4df2-92e7-14521aa14305" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.646747 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6850c9-0076-4df2-92e7-14521aa14305" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.647040 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6850c9-0076-4df2-92e7-14521aa14305" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.648097 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.649835 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.651179 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.651520 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.651823 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.661658 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p"] Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.800778 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fqll\" (UniqueName: \"kubernetes.io/projected/95440627-f74c-45d0-a168-e8c37e8e7122-kube-api-access-7fqll\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.800860 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.800980 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.903217 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.903283 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.903470 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fqll\" (UniqueName: \"kubernetes.io/projected/95440627-f74c-45d0-a168-e8c37e8e7122-kube-api-access-7fqll\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.911765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.913374 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.927176 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fqll\" (UniqueName: \"kubernetes.io/projected/95440627-f74c-45d0-a168-e8c37e8e7122-kube-api-access-7fqll\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-spb5p\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.950293 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:13:53 crc kubenswrapper[4757]: E1216 13:13:53.950591 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:13:53 crc kubenswrapper[4757]: I1216 13:13:53.967911 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:13:54 crc kubenswrapper[4757]: I1216 13:13:54.500538 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p"] Dec 16 13:13:54 crc kubenswrapper[4757]: I1216 13:13:54.557850 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" event={"ID":"95440627-f74c-45d0-a168-e8c37e8e7122","Type":"ContainerStarted","Data":"d89abbc544068e4a5360bf28d955d372f93ce4d3e4f3b5e1f7f5147bb7e56228"} Dec 16 13:13:55 crc kubenswrapper[4757]: I1216 13:13:55.419836 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:13:56 crc kubenswrapper[4757]: I1216 13:13:56.576593 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" event={"ID":"95440627-f74c-45d0-a168-e8c37e8e7122","Type":"ContainerStarted","Data":"5b6ac9aad9f1be9cd489c6c5c89a2c6d4ba2964a19b2875d2c356efe9a77186a"} Dec 16 13:13:56 crc kubenswrapper[4757]: I1216 13:13:56.612693 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" podStartSLOduration=2.7053642289999997 podStartE2EDuration="3.612669733s" podCreationTimestamp="2025-12-16 13:13:53 +0000 UTC" firstStartedPulling="2025-12-16 13:13:54.509177586 +0000 UTC m=+1619.936921382" lastFinishedPulling="2025-12-16 13:13:55.41648309 +0000 UTC m=+1620.844226886" observedRunningTime="2025-12-16 13:13:56.592373539 +0000 UTC m=+1622.020117335" watchObservedRunningTime="2025-12-16 13:13:56.612669733 +0000 UTC m=+1622.040413529" Dec 16 13:13:58 crc kubenswrapper[4757]: I1216 13:13:58.594291 4757 generic.go:334] "Generic (PLEG): container finished" podID="95440627-f74c-45d0-a168-e8c37e8e7122" containerID="5b6ac9aad9f1be9cd489c6c5c89a2c6d4ba2964a19b2875d2c356efe9a77186a" exitCode=0 Dec 16 13:13:58 crc kubenswrapper[4757]: I1216 13:13:58.594363 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" event={"ID":"95440627-f74c-45d0-a168-e8c37e8e7122","Type":"ContainerDied","Data":"5b6ac9aad9f1be9cd489c6c5c89a2c6d4ba2964a19b2875d2c356efe9a77186a"} Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.024068 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.122633 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fqll\" (UniqueName: \"kubernetes.io/projected/95440627-f74c-45d0-a168-e8c37e8e7122-kube-api-access-7fqll\") pod \"95440627-f74c-45d0-a168-e8c37e8e7122\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.122694 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-inventory\") pod \"95440627-f74c-45d0-a168-e8c37e8e7122\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.122734 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-ssh-key\") pod \"95440627-f74c-45d0-a168-e8c37e8e7122\" (UID: \"95440627-f74c-45d0-a168-e8c37e8e7122\") " Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.128927 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95440627-f74c-45d0-a168-e8c37e8e7122-kube-api-access-7fqll" (OuterVolumeSpecName: "kube-api-access-7fqll") pod "95440627-f74c-45d0-a168-e8c37e8e7122" (UID: "95440627-f74c-45d0-a168-e8c37e8e7122"). InnerVolumeSpecName "kube-api-access-7fqll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.156073 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-inventory" (OuterVolumeSpecName: "inventory") pod "95440627-f74c-45d0-a168-e8c37e8e7122" (UID: "95440627-f74c-45d0-a168-e8c37e8e7122"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.156708 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "95440627-f74c-45d0-a168-e8c37e8e7122" (UID: "95440627-f74c-45d0-a168-e8c37e8e7122"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.224988 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fqll\" (UniqueName: \"kubernetes.io/projected/95440627-f74c-45d0-a168-e8c37e8e7122-kube-api-access-7fqll\") on node \"crc\" DevicePath \"\"" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.225036 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.225046 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95440627-f74c-45d0-a168-e8c37e8e7122-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.612932 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" event={"ID":"95440627-f74c-45d0-a168-e8c37e8e7122","Type":"ContainerDied","Data":"d89abbc544068e4a5360bf28d955d372f93ce4d3e4f3b5e1f7f5147bb7e56228"} Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.612969 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89abbc544068e4a5360bf28d955d372f93ce4d3e4f3b5e1f7f5147bb7e56228" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.613182 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-spb5p" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.704505 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg"] Dec 16 13:14:00 crc kubenswrapper[4757]: E1216 13:14:00.705193 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95440627-f74c-45d0-a168-e8c37e8e7122" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.705274 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="95440627-f74c-45d0-a168-e8c37e8e7122" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.705602 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="95440627-f74c-45d0-a168-e8c37e8e7122" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.706323 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.709581 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.709987 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.710062 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.714661 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.738585 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg"] Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.838513 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.838778 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.839249 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.839411 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbxn\" (UniqueName: \"kubernetes.io/projected/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-kube-api-access-7jbxn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.941543 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbxn\" (UniqueName: \"kubernetes.io/projected/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-kube-api-access-7jbxn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.941704 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.941764 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.941812 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.954111 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.956584 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.957207 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:00 crc kubenswrapper[4757]: I1216 13:14:00.997703 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbxn\" (UniqueName: \"kubernetes.io/projected/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-kube-api-access-7jbxn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:01 crc kubenswrapper[4757]: I1216 13:14:01.039911 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" Dec 16 13:14:01 crc kubenswrapper[4757]: I1216 13:14:01.657338 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg"] Dec 16 13:14:01 crc kubenswrapper[4757]: I1216 13:14:01.666281 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:14:02 crc kubenswrapper[4757]: I1216 13:14:02.633691 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" event={"ID":"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1","Type":"ContainerStarted","Data":"e00beda87a0b8890f15a52308c914d6f37cc5d212a823eb673fc6fa4815e6c8a"} Dec 16 13:14:02 crc kubenswrapper[4757]: I1216 13:14:02.634036 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" event={"ID":"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1","Type":"ContainerStarted","Data":"24b09a47238f432f401780499954c7adef079756d1073e9378b18c9ebfd227cd"} Dec 16 13:14:02 crc kubenswrapper[4757]: I1216 13:14:02.659071 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" podStartSLOduration=2.078111331 podStartE2EDuration="2.659050237s" podCreationTimestamp="2025-12-16 13:14:00 +0000 UTC" firstStartedPulling="2025-12-16 13:14:01.66605373 +0000 UTC m=+1627.093797526" lastFinishedPulling="2025-12-16 13:14:02.246992636 +0000 UTC m=+1627.674736432" observedRunningTime="2025-12-16 13:14:02.651785167 +0000 UTC m=+1628.079528953" watchObservedRunningTime="2025-12-16 13:14:02.659050237 +0000 UTC m=+1628.086794033" Dec 16 13:14:08 crc kubenswrapper[4757]: I1216 13:14:08.953333 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:14:08 crc kubenswrapper[4757]: E1216 13:14:08.954728 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:14:21 crc kubenswrapper[4757]: I1216 13:14:21.949401 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:14:21 crc kubenswrapper[4757]: E1216 13:14:21.950706 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:14:34 crc kubenswrapper[4757]: I1216 13:14:34.956342 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:14:34 crc kubenswrapper[4757]: E1216 13:14:34.958809 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:14:48 crc kubenswrapper[4757]: I1216 13:14:48.949281 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:14:48 crc kubenswrapper[4757]: E1216 13:14:48.950181 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:14:59 crc kubenswrapper[4757]: I1216 13:14:59.949398 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:14:59 crc kubenswrapper[4757]: E1216 13:14:59.950132 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.348111 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc"] Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.349332 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.351404 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.363278 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc"] Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.364336 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.436213 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b926k\" (UniqueName: \"kubernetes.io/projected/692db28f-b3cf-43b3-8822-fc5898543119-kube-api-access-b926k\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.436380 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692db28f-b3cf-43b3-8822-fc5898543119-config-volume\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.436404 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692db28f-b3cf-43b3-8822-fc5898543119-secret-volume\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.539367 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692db28f-b3cf-43b3-8822-fc5898543119-config-volume\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.539439 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692db28f-b3cf-43b3-8822-fc5898543119-secret-volume\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.539653 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b926k\" (UniqueName: \"kubernetes.io/projected/692db28f-b3cf-43b3-8822-fc5898543119-kube-api-access-b926k\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.541481 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692db28f-b3cf-43b3-8822-fc5898543119-config-volume\") pod 
\"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.549707 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692db28f-b3cf-43b3-8822-fc5898543119-secret-volume\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.557970 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b926k\" (UniqueName: \"kubernetes.io/projected/692db28f-b3cf-43b3-8822-fc5898543119-kube-api-access-b926k\") pod \"collect-profiles-29431515-hw5rc\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:00 crc kubenswrapper[4757]: I1216 13:15:00.669367 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" Dec 16 13:15:01 crc kubenswrapper[4757]: I1216 13:15:01.187991 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc"] Dec 16 13:15:02 crc kubenswrapper[4757]: I1216 13:15:02.156187 4757 generic.go:334] "Generic (PLEG): container finished" podID="692db28f-b3cf-43b3-8822-fc5898543119" containerID="039e694a3cb005f960d1484fa244d8eff9e777fdc7c40fcef9e53d7076d20204" exitCode=0 Dec 16 13:15:02 crc kubenswrapper[4757]: I1216 13:15:02.156265 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" event={"ID":"692db28f-b3cf-43b3-8822-fc5898543119","Type":"ContainerDied","Data":"039e694a3cb005f960d1484fa244d8eff9e777fdc7c40fcef9e53d7076d20204"} Dec 16 13:15:02 crc kubenswrapper[4757]: I1216 13:15:02.156519 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" event={"ID":"692db28f-b3cf-43b3-8822-fc5898543119","Type":"ContainerStarted","Data":"87543c49eed2824fc335822cfc136fed0da7cdb329c0bd48b2315a58a2e5487c"} Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.486523 4757 util.go:48] "No ready sandbox for pod can be found. 
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.607710 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692db28f-b3cf-43b3-8822-fc5898543119-secret-volume\") pod \"692db28f-b3cf-43b3-8822-fc5898543119\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") "
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.607747 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b926k\" (UniqueName: \"kubernetes.io/projected/692db28f-b3cf-43b3-8822-fc5898543119-kube-api-access-b926k\") pod \"692db28f-b3cf-43b3-8822-fc5898543119\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") "
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.607806 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692db28f-b3cf-43b3-8822-fc5898543119-config-volume\") pod \"692db28f-b3cf-43b3-8822-fc5898543119\" (UID: \"692db28f-b3cf-43b3-8822-fc5898543119\") "
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.608567 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692db28f-b3cf-43b3-8822-fc5898543119-config-volume" (OuterVolumeSpecName: "config-volume") pod "692db28f-b3cf-43b3-8822-fc5898543119" (UID: "692db28f-b3cf-43b3-8822-fc5898543119"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.613595 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692db28f-b3cf-43b3-8822-fc5898543119-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "692db28f-b3cf-43b3-8822-fc5898543119" (UID: "692db28f-b3cf-43b3-8822-fc5898543119"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.618528 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692db28f-b3cf-43b3-8822-fc5898543119-kube-api-access-b926k" (OuterVolumeSpecName: "kube-api-access-b926k") pod "692db28f-b3cf-43b3-8822-fc5898543119" (UID: "692db28f-b3cf-43b3-8822-fc5898543119"). InnerVolumeSpecName "kube-api-access-b926k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.711736 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692db28f-b3cf-43b3-8822-fc5898543119-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.711782 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b926k\" (UniqueName: \"kubernetes.io/projected/692db28f-b3cf-43b3-8822-fc5898543119-kube-api-access-b926k\") on node \"crc\" DevicePath \"\""
Dec 16 13:15:03 crc kubenswrapper[4757]: I1216 13:15:03.711796 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692db28f-b3cf-43b3-8822-fc5898543119-config-volume\") on node \"crc\" DevicePath \"\""
Dec 16 13:15:04 crc kubenswrapper[4757]: I1216 13:15:04.173881 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc"
Dec 16 13:15:04 crc kubenswrapper[4757]: I1216 13:15:04.173866 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc" event={"ID":"692db28f-b3cf-43b3-8822-fc5898543119","Type":"ContainerDied","Data":"87543c49eed2824fc335822cfc136fed0da7cdb329c0bd48b2315a58a2e5487c"}
Dec 16 13:15:04 crc kubenswrapper[4757]: I1216 13:15:04.174294 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87543c49eed2824fc335822cfc136fed0da7cdb329c0bd48b2315a58a2e5487c"
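[Note] The collect-profiles pod above (a run of the OLM collect-profiles CronJob, judging by the timestamped suffix in its name) traces one complete volume lifecycle: VerifyControllerAttachedVolume and MountVolume.SetUp on the way up at 13:15:00, then UnmountVolume.TearDown and "Volume detached" at 13:15:03 once the container exits. The kubelet's volume manager produces exactly this sequence by reconciling a desired-state cache (volumes that scheduled pods require) against an actual-state cache (volumes currently mounted). A toy version of that reconcile step, under hypothetical names (reconcile, desired, actual); the real logic lives in the kubelet's volumemanager reconciler:

    package main

    import "fmt"

    // reconcile compares what should be mounted with what is mounted and
    // returns the operations the kubelet logs above: SetUp for volumes a
    // pod still needs, TearDown for volumes whose pod has gone away.
    func reconcile(desired, actual map[string]bool) []string {
        var ops []string
        for vol := range desired {
            if !actual[vol] {
                ops = append(ops, "MountVolume.SetUp "+vol)
            }
        }
        for vol := range actual {
            if !desired[vol] {
                ops = append(ops, "UnmountVolume.TearDown "+vol)
            }
        }
        return ops
    }

    func main() {
        // After the pod finished, nothing is desired, so every volume
        // mounted in the entries above gets torn down (order not fixed).
        mounted := map[string]bool{
            "config-volume": true, "secret-volume": true, "kube-api-access-b926k": true,
        }
        fmt.Println(reconcile(map[string]bool{}, mounted))
    }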
pods=["openstack/keystone-db-create-gwvrz"] Dec 16 13:15:49 crc kubenswrapper[4757]: I1216 13:15:49.053175 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-16e6-account-create-update-5mfht"] Dec 16 13:15:49 crc kubenswrapper[4757]: I1216 13:15:49.062616 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-16e6-account-create-update-5mfht"] Dec 16 13:15:49 crc kubenswrapper[4757]: I1216 13:15:49.070887 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gwvrz"] Dec 16 13:15:49 crc kubenswrapper[4757]: I1216 13:15:49.079048 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5cbb-account-create-update-d89ll"] Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.031983 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-s6pnm"] Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.039963 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e53a-account-create-update-9mz79"] Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.047803 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-s6pnm"] Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.059035 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e53a-account-create-update-9mz79"] Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.975839 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b512b5d-218c-4612-8030-536b8b21ab7d" path="/var/lib/kubelet/pods/0b512b5d-218c-4612-8030-536b8b21ab7d/volumes" Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.977780 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a2f7431-0cc4-4be1-ba70-36ee0333bc0d" path="/var/lib/kubelet/pods/1a2f7431-0cc4-4be1-ba70-36ee0333bc0d/volumes" Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.979516 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3061dea3-7e30-407e-a77b-b696ae6710a1" path="/var/lib/kubelet/pods/3061dea3-7e30-407e-a77b-b696ae6710a1/volumes" Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.981881 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3800f971-f8d9-46cd-9b10-36095059a766" path="/var/lib/kubelet/pods/3800f971-f8d9-46cd-9b10-36095059a766/volumes" Dec 16 13:15:50 crc kubenswrapper[4757]: I1216 13:15:50.984912 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63cbd0d4-eb87-4ee1-a6b1-cfe327223d42" path="/var/lib/kubelet/pods/63cbd0d4-eb87-4ee1-a6b1-cfe327223d42/volumes" Dec 16 13:15:54 crc kubenswrapper[4757]: I1216 13:15:54.955588 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:15:54 crc kubenswrapper[4757]: E1216 13:15:54.955925 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:16:06 crc kubenswrapper[4757]: I1216 13:16:06.949986 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:16:06 crc kubenswrapper[4757]: 
Dec 16 13:16:20 crc kubenswrapper[4757]: I1216 13:16:20.948987 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3"
Dec 16 13:16:20 crc kubenswrapper[4757]: E1216 13:16:20.951329 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.040207 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9l9bq"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.049685 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e7fa-account-create-update-4hn76"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.063004 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-phhth"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.070946 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ctbr2"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.082306 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9l9bq"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.093497 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ctbr2"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.100739 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-phhth"]
Dec 16 13:16:23 crc kubenswrapper[4757]: I1216 13:16:23.107950 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e7fa-account-create-update-4hn76"]
Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.036885 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-35f9-account-create-update-8lns5"]
Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.046839 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-44ee-account-create-update-9ql9v"]
Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.056667 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-35f9-account-create-update-8lns5"]
Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.065027 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-44ee-account-create-update-9ql9v"]
Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.959968 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292c184a-83d5-46a5-a309-f72088414fe7" path="/var/lib/kubelet/pods/292c184a-83d5-46a5-a309-f72088414fe7/volumes"
Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.960634 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f33d4c-7376-4b9c-bb1f-192bcefc1b77" path="/var/lib/kubelet/pods/39f33d4c-7376-4b9c-bb1f-192bcefc1b77/volumes"
podUID="39f33d4c-7376-4b9c-bb1f-192bcefc1b77" path="/var/lib/kubelet/pods/39f33d4c-7376-4b9c-bb1f-192bcefc1b77/volumes" Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.961406 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f76f2d4-df5a-46a3-8a3f-7c326a48b045" path="/var/lib/kubelet/pods/7f76f2d4-df5a-46a3-8a3f-7c326a48b045/volumes" Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.962301 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8" path="/var/lib/kubelet/pods/9f6a3c19-82eb-4afc-9a84-ee5606fdd0d8/volumes" Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.963502 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab469031-4907-4d0c-b47f-5c34d3af3858" path="/var/lib/kubelet/pods/ab469031-4907-4d0c-b47f-5c34d3af3858/volumes" Dec 16 13:16:24 crc kubenswrapper[4757]: I1216 13:16:24.964147 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1627bec-94b0-4d6b-bbd4-178fa53884ff" path="/var/lib/kubelet/pods/c1627bec-94b0-4d6b-bbd4-178fa53884ff/volumes" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.595732 4757 scope.go:117] "RemoveContainer" containerID="1755d59f079f71e8e90e60d6f83d6f422fb8342e1841643a23ba8672a74519ec" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.621160 4757 scope.go:117] "RemoveContainer" containerID="ac636fd6af82484a667c4ce5ebb86d03b641b98c47f5e8c9d104671049c90c15" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.678347 4757 scope.go:117] "RemoveContainer" containerID="0beebbc3b3fac544c9c5650a3b010b51648dba01671ca67035230d4eb4e8dfff" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.748156 4757 scope.go:117] "RemoveContainer" containerID="a1d23ada33990caf50e3e437704cff1d9bb4630f199e164c03f149247b429bdc" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.767585 4757 scope.go:117] "RemoveContainer" containerID="88acc2605ef2abf8cf35f0edf60f1ac278643cd7f4d05fdb449bb7358e0220e3" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.831880 4757 scope.go:117] "RemoveContainer" containerID="b366a7ada5cd808a8900f31f40d9df9d14af61a6afc7b1fd95ff8267be6c1587" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.864659 4757 scope.go:117] "RemoveContainer" containerID="f1521b3d76b4ef6cc4257efa3cbe8cae7495e08dc0aae39be9035d94770a0208" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.885791 4757 scope.go:117] "RemoveContainer" containerID="0dc4088b46647d69c1b53bfade8a66fc785b067a8c02a37380ba11d296c46ec0" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.906833 4757 scope.go:117] "RemoveContainer" containerID="5431d1e9740d00772d8917012d0e343e3e5d7f745e54749265d56f18fe08f951" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.935661 4757 scope.go:117] "RemoveContainer" containerID="7fd4c00449672da9dd83012aef5cd0874985ae6bcb5c984d7c347396768c76eb" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.961928 4757 scope.go:117] "RemoveContainer" containerID="c9672ed0fc0283c34eb8f447496c110a39f112477587937bcee8fff0821188b6" Dec 16 13:16:28 crc kubenswrapper[4757]: I1216 13:16:28.982263 4757 scope.go:117] "RemoveContainer" containerID="f4d4a6e607675bd63518611423ac412afad093a7c3b4236d3a90d1089e4973c6" Dec 16 13:16:29 crc kubenswrapper[4757]: I1216 13:16:29.008230 4757 scope.go:117] "RemoveContainer" containerID="d231d340be5da33f9ad59b43d837ac9ddeb1a295101f8566161166a846910f49" Dec 16 13:16:34 crc kubenswrapper[4757]: I1216 13:16:34.975620 4757 scope.go:117] "RemoveContainer" 
containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:16:34 crc kubenswrapper[4757]: E1216 13:16:34.976433 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:16:37 crc kubenswrapper[4757]: I1216 13:16:37.043479 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-j8dhl"] Dec 16 13:16:37 crc kubenswrapper[4757]: I1216 13:16:37.052886 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-j8dhl"] Dec 16 13:16:38 crc kubenswrapper[4757]: I1216 13:16:38.964391 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a76784-332d-479e-9cce-1f5acb1e828f" path="/var/lib/kubelet/pods/b5a76784-332d-479e-9cce-1f5acb1e828f/volumes" Dec 16 13:16:40 crc kubenswrapper[4757]: I1216 13:16:40.031179 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-54js6"] Dec 16 13:16:40 crc kubenswrapper[4757]: I1216 13:16:40.039828 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-54js6"] Dec 16 13:16:40 crc kubenswrapper[4757]: I1216 13:16:40.963851 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe9cebdb-26c9-4618-9640-5e17d5976d12" path="/var/lib/kubelet/pods/fe9cebdb-26c9-4618-9640-5e17d5976d12/volumes" Dec 16 13:16:45 crc kubenswrapper[4757]: I1216 13:16:45.948950 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:16:45 crc kubenswrapper[4757]: E1216 13:16:45.949545 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:16:59 crc kubenswrapper[4757]: I1216 13:16:59.949422 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:17:00 crc kubenswrapper[4757]: I1216 13:17:00.181538 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"eca1d98db7f374bb43377a0b9dbe9ada84f08b48c42f15512ead147338cbbe40"} Dec 16 13:17:29 crc kubenswrapper[4757]: I1216 13:17:29.283773 4757 scope.go:117] "RemoveContainer" containerID="91f4d3761e0ea7dc6f6f468b0c0462a930abcff747338f88d16a9ab8bc8d6512" Dec 16 13:17:29 crc kubenswrapper[4757]: I1216 13:17:29.313284 4757 scope.go:117] "RemoveContainer" containerID="6e41157660f62a54e1cee8244fb66a8511e105080f30e5e8a7cdb0bfb294e497" Dec 16 13:17:43 crc kubenswrapper[4757]: I1216 13:17:43.556895 4757 generic.go:334] "Generic (PLEG): container finished" podID="94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" containerID="e00beda87a0b8890f15a52308c914d6f37cc5d212a823eb673fc6fa4815e6c8a" exitCode=0 Dec 16 13:17:43 crc kubenswrapper[4757]: I1216 
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.012042 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg"
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.161705 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-ssh-key\") pod \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") "
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.161808 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-bootstrap-combined-ca-bundle\") pod \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") "
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.161984 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-inventory\") pod \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") "
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.162046 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jbxn\" (UniqueName: \"kubernetes.io/projected/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-kube-api-access-7jbxn\") pod \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\" (UID: \"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1\") "
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.169413 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" (UID: "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.170158 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-kube-api-access-7jbxn" (OuterVolumeSpecName: "kube-api-access-7jbxn") pod "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" (UID: "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1"). InnerVolumeSpecName "kube-api-access-7jbxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.195286 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-inventory" (OuterVolumeSpecName: "inventory") pod "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" (UID: "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.197221 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" (UID: "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.265546 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.265581 4757 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.265591 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.265600 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jbxn\" (UniqueName: \"kubernetes.io/projected/94b4d3d7-3488-45fa-bbeb-894a4bb55ca1-kube-api-access-7jbxn\") on node \"crc\" DevicePath \"\"" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.579692 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg" event={"ID":"94b4d3d7-3488-45fa-bbeb-894a4bb55ca1","Type":"ContainerDied","Data":"24b09a47238f432f401780499954c7adef079756d1073e9378b18c9ebfd227cd"} Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.579737 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b09a47238f432f401780499954c7adef079756d1073e9378b18c9ebfd227cd" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.579789 4757 util.go:48] "No ready sandbox for pod can be found. 
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.676375 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm"]
Dec 16 13:17:45 crc kubenswrapper[4757]: E1216 13:17:45.676785 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.676800 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 16 13:17:45 crc kubenswrapper[4757]: E1216 13:17:45.676826 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692db28f-b3cf-43b3-8822-fc5898543119" containerName="collect-profiles"
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.676832 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="692db28f-b3cf-43b3-8822-fc5898543119" containerName="collect-profiles"
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.677019 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="692db28f-b3cf-43b3-8822-fc5898543119" containerName="collect-profiles"
Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.677040 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b4d3d7-3488-45fa-bbeb-894a4bb55ca1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
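[Note] The cpu_manager / memory_manager entries above fire while admitting the new download-cache pod: before reserving resources, the kubelet's resource managers drop per-container state belonging to pods that no longer exist ("RemoveStaleState: removing container", "Deleted CPUSet assignment"); the two pods purged here are the bootstrap job and the collect-profiles run that finished earlier. A toy version of the purge (removeStaleState and podExists are illustrative names, not kubelet identifiers):

    package main

    import "fmt"

    // removeStaleState drops resource-manager assignments whose owning pod
    // is gone, as in the cpu_manager/memory_manager entries above.
    func removeStaleState(assignments map[string]string, podExists func(string) bool) {
        for podUID := range assignments {
            if !podExists(podUID) {
                delete(assignments, podUID) // "Deleted CPUSet assignment"
            }
        }
    }

    func main() {
        a := map[string]string{
            "94b4d3d7-3488-45fa-bbeb-894a4bb55ca1": "bootstrap-edpm-deployment-openstack-edpm-ipam",
            "692db28f-b3cf-43b3-8822-fc5898543119": "collect-profiles",
        }
        removeStaleState(a, func(string) bool { return false }) // neither pod exists any more
        fmt.Println(len(a), "assignments left")
    }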
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.877294 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.979577 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.979632 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gglzh\" (UniqueName: \"kubernetes.io/projected/74f0d526-ef23-47fd-b475-6f799fd57ba5-kube-api-access-gglzh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.979688 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.985754 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:45 crc kubenswrapper[4757]: I1216 13:17:45.985910 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:46 crc kubenswrapper[4757]: I1216 13:17:46.003689 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gglzh\" (UniqueName: \"kubernetes.io/projected/74f0d526-ef23-47fd-b475-6f799fd57ba5-kube-api-access-gglzh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:17:46 crc kubenswrapper[4757]: I1216 13:17:46.294905 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 13:17:46 crc kubenswrapper[4757]: I1216 13:17:46.857515 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm"]
Dec 16 13:17:47 crc kubenswrapper[4757]: I1216 13:17:47.598863 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" event={"ID":"74f0d526-ef23-47fd-b475-6f799fd57ba5","Type":"ContainerStarted","Data":"3c0c1ecb94c84d905ec1eb3034dc766f28e55bbea94ffd9b112047de2c8cb702"}
Dec 16 13:17:48 crc kubenswrapper[4757]: I1216 13:17:48.611777 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" event={"ID":"74f0d526-ef23-47fd-b475-6f799fd57ba5","Type":"ContainerStarted","Data":"79ce89fae9778010d841810ccdaf9d87af2f2e3836f74f858f4fa16f9990bde6"}
Dec 16 13:17:48 crc kubenswrapper[4757]: I1216 13:17:48.631433 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" podStartSLOduration=2.867181385 podStartE2EDuration="3.631409986s" podCreationTimestamp="2025-12-16 13:17:45 +0000 UTC" firstStartedPulling="2025-12-16 13:17:46.867493884 +0000 UTC m=+1852.295237680" lastFinishedPulling="2025-12-16 13:17:47.631722485 +0000 UTC m=+1853.059466281" observedRunningTime="2025-12-16 13:17:48.627501178 +0000 UTC m=+1854.055244974" watchObservedRunningTime="2025-12-16 13:17:48.631409986 +0000 UTC m=+1854.059153792"
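[Note] The pod_startup_latency_tracker entry above carries its own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from that. The "m=+…" suffixes are Go monotonic-clock readings taken since the kubelet process started, not part of the wall-clock values. Recomputing both figures from the entry's own timestamps (the second tracker entry at 13:20:09 further below follows the same formula, where the subtraction's float rounding even shows up as podStartSLOduration=2.3366361319999998):

    package main

    import (
        "fmt"
        "time"
    )

    func parse(s string) time.Time {
        // Layout matches the wall-clock part of the log's timestamps.
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := parse("2025-12-16 13:17:45 +0000 UTC")
        firstPull := parse("2025-12-16 13:17:46.867493884 +0000 UTC")
        lastPull := parse("2025-12-16 13:17:47.631722485 +0000 UTC")
        running := parse("2025-12-16 13:17:48.631409986 +0000 UTC")

        e2e := running.Sub(created)          // 3.631409986s, as logged
        slo := e2e - lastPull.Sub(firstPull) // 2.867181385s, as logged
        fmt.Println("podStartE2EDuration:", e2e)
        fmt.Println("podStartSLOduration:", slo)
    }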
Dec 16 13:17:49 crc kubenswrapper[4757]: I1216 13:17:49.052595 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kg598"]
Dec 16 13:17:49 crc kubenswrapper[4757]: I1216 13:17:49.060625 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kg598"]
Dec 16 13:17:50 crc kubenswrapper[4757]: I1216 13:17:50.976023 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082914b4-7f60-4d23-98ec-51f3c8a831aa" path="/var/lib/kubelet/pods/082914b4-7f60-4d23-98ec-51f3c8a831aa/volumes"
Dec 16 13:17:58 crc kubenswrapper[4757]: I1216 13:17:58.030879 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-86cds"]
Dec 16 13:17:58 crc kubenswrapper[4757]: I1216 13:17:58.041560 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-86cds"]
Dec 16 13:17:58 crc kubenswrapper[4757]: I1216 13:17:58.960518 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d57428-c378-4e57-87c5-f1fff2398cec" path="/var/lib/kubelet/pods/25d57428-c378-4e57-87c5-f1fff2398cec/volumes"
Dec 16 13:18:01 crc kubenswrapper[4757]: I1216 13:18:01.047093 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tjbz5"]
Dec 16 13:18:01 crc kubenswrapper[4757]: I1216 13:18:01.057158 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tjbz5"]
Dec 16 13:18:02 crc kubenswrapper[4757]: I1216 13:18:02.959689 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362eaecb-4139-44f9-a651-3e14cc2d6ae2" path="/var/lib/kubelet/pods/362eaecb-4139-44f9-a651-3e14cc2d6ae2/volumes"
Dec 16 13:18:09 crc kubenswrapper[4757]: I1216 13:18:09.039098 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7lhwd"]
Dec 16 13:18:09 crc kubenswrapper[4757]: I1216 13:18:09.046644 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-pcl59"]
Dec 16 13:18:09 crc kubenswrapper[4757]: I1216 13:18:09.063770 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-pcl59"]
Dec 16 13:18:09 crc kubenswrapper[4757]: I1216 13:18:09.076526 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7lhwd"]
Dec 16 13:18:10 crc kubenswrapper[4757]: I1216 13:18:10.962325 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e48bb858-bd6e-4dbc-a17a-5fd5e1275e00" path="/var/lib/kubelet/pods/e48bb858-bd6e-4dbc-a17a-5fd5e1275e00/volumes"
Dec 16 13:18:10 crc kubenswrapper[4757]: I1216 13:18:10.963915 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9a7054-7c7f-4e36-8d57-e095087a7878" path="/var/lib/kubelet/pods/fc9a7054-7c7f-4e36-8d57-e095087a7878/volumes"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.406843 4757 scope.go:117] "RemoveContainer" containerID="e432176b39a460d47e7d77399b0f4c007df8990074a5a85f35961c0774cacecc"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.440879 4757 scope.go:117] "RemoveContainer" containerID="0e351d201b95334e733de4822ee4dfa2d43cbb85bf172b683744b901cb0cd0e8"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.521728 4757 scope.go:117] "RemoveContainer" containerID="ad9d80b9af09d564a6151170db5443c56272b5afdd1aefa55ddbbd775f41a0aa"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.559594 4757 scope.go:117] "RemoveContainer" containerID="5e198d54c33d19283c1b227ff4c28be1278e25af8630e271c1f46e46d1980127"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.599809 4757 scope.go:117] "RemoveContainer" containerID="f20ef5e006c72abb7a0a7a3f3c95380816bf64afd8ac9ee819347f5406e6e010"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.639027 4757 scope.go:117] "RemoveContainer" containerID="de39c9dae387c9a7a0a0337645a1f8fda98b3c76c812fc6e19ebb15643aca6d7"
Dec 16 13:18:29 crc kubenswrapper[4757]: I1216 13:18:29.689782 4757 scope.go:117] "RemoveContainer" containerID="ab24c6ca0319089d7a90e87725aa3faf795b768c3f535f3c5628e6e3b7002fca"
Dec 16 13:18:56 crc kubenswrapper[4757]: I1216 13:18:56.052138 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4plp4"]
Dec 16 13:18:56 crc kubenswrapper[4757]: I1216 13:18:56.064692 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4plp4"]
Dec 16 13:18:56 crc kubenswrapper[4757]: I1216 13:18:56.958702 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956cce89-a20b-448a-9cf2-16b3ddcafe10" path="/var/lib/kubelet/pods/956cce89-a20b-448a-9cf2-16b3ddcafe10/volumes"
Dec 16 13:18:57 crc kubenswrapper[4757]: I1216 13:18:57.028151 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4656-account-create-update-jztlz"]
Dec 16 13:18:57 crc kubenswrapper[4757]: I1216 13:18:57.036690 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bcd7-account-create-update-59qnr"]
Dec 16 13:18:57 crc kubenswrapper[4757]: I1216 13:18:57.046281 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4656-account-create-update-jztlz"]
Dec 16 13:18:57 crc kubenswrapper[4757]: I1216 13:18:57.055832 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bcd7-account-create-update-59qnr"]
Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.036799 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-q6drx"]
source="api" pods=["openstack/nova-cell0-db-create-q6drx"] Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.052670 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lhdf9"] Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.064312 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-44ce-account-create-update-54r5x"] Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.074230 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-q6drx"] Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.082613 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-44ce-account-create-update-54r5x"] Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.093198 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lhdf9"] Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.959420 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1488e9fc-9b52-4515-8fb3-980469c83ae8" path="/var/lib/kubelet/pods/1488e9fc-9b52-4515-8fb3-980469c83ae8/volumes" Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.962088 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427479ca-18b6-4580-a4ef-f85bb4071c88" path="/var/lib/kubelet/pods/427479ca-18b6-4580-a4ef-f85bb4071c88/volumes" Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.962896 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87dfa09a-c8bd-4861-b45f-7574d8295fa1" path="/var/lib/kubelet/pods/87dfa09a-c8bd-4861-b45f-7574d8295fa1/volumes" Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.963889 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da576af-ae63-447b-acaf-a6a7bfe96ddb" path="/var/lib/kubelet/pods/9da576af-ae63-447b-acaf-a6a7bfe96ddb/volumes" Dec 16 13:18:58 crc kubenswrapper[4757]: I1216 13:18:58.965888 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe43680-5772-43bd-9ca1-4cb0245cae49" path="/var/lib/kubelet/pods/efe43680-5772-43bd-9ca1-4cb0245cae49/volumes" Dec 16 13:19:21 crc kubenswrapper[4757]: I1216 13:19:21.181461 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:19:21 crc kubenswrapper[4757]: I1216 13:19:21.182267 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:19:29 crc kubenswrapper[4757]: I1216 13:19:29.904930 4757 scope.go:117] "RemoveContainer" containerID="16dd2217e5ae1a2315f7849fcd38180ee50184b219ec945589504ceef408ad24" Dec 16 13:19:29 crc kubenswrapper[4757]: I1216 13:19:29.948617 4757 scope.go:117] "RemoveContainer" containerID="3a9ae0ea5a868d4ac291a5f13cd7caa7114786cb1777cd6c48f7c56766b75279" Dec 16 13:19:29 crc kubenswrapper[4757]: I1216 13:19:29.980338 4757 scope.go:117] "RemoveContainer" containerID="ada0d0eef689644fe20216a8d2fabb8480020fef64c026e5ab8fc518b1e979dc" Dec 16 13:19:30 crc kubenswrapper[4757]: I1216 13:19:30.035961 4757 scope.go:117] "RemoveContainer" 
containerID="1dc304aad474c8858594e0ddc43429ed0e67764892182e8ac4de358acd80179a" Dec 16 13:19:30 crc kubenswrapper[4757]: I1216 13:19:30.076545 4757 scope.go:117] "RemoveContainer" containerID="8f051f56e4baa5e1cfb82dbd4d7f6c70a9a250f0460f2b330b0f30c6ca589ac3" Dec 16 13:19:30 crc kubenswrapper[4757]: I1216 13:19:30.127944 4757 scope.go:117] "RemoveContainer" containerID="4144e02fa39fd42cb957289095adc59adb01dd6b67c1c6acb9898c1c3cd3026f" Dec 16 13:19:51 crc kubenswrapper[4757]: I1216 13:19:51.181951 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:19:51 crc kubenswrapper[4757]: I1216 13:19:51.182693 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:20:05 crc kubenswrapper[4757]: I1216 13:20:05.744104 4757 generic.go:334] "Generic (PLEG): container finished" podID="74f0d526-ef23-47fd-b475-6f799fd57ba5" containerID="79ce89fae9778010d841810ccdaf9d87af2f2e3836f74f858f4fa16f9990bde6" exitCode=0 Dec 16 13:20:05 crc kubenswrapper[4757]: I1216 13:20:05.744188 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" event={"ID":"74f0d526-ef23-47fd-b475-6f799fd57ba5","Type":"ContainerDied","Data":"79ce89fae9778010d841810ccdaf9d87af2f2e3836f74f858f4fa16f9990bde6"} Dec 16 13:20:06 crc kubenswrapper[4757]: I1216 13:20:06.097953 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"] Dec 16 13:20:06 crc kubenswrapper[4757]: I1216 13:20:06.124274 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"] Dec 16 13:20:06 crc kubenswrapper[4757]: I1216 13:20:06.965582 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" path="/var/lib/kubelet/pods/8a6fdfdb-145b-460e-b8e9-9f44e9034f40/volumes" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.197080 4757 util.go:48] "No ready sandbox for pod can be found. 
Dec 16 13:20:05 crc kubenswrapper[4757]: I1216 13:20:05.744104 4757 generic.go:334] "Generic (PLEG): container finished" podID="74f0d526-ef23-47fd-b475-6f799fd57ba5" containerID="79ce89fae9778010d841810ccdaf9d87af2f2e3836f74f858f4fa16f9990bde6" exitCode=0
Dec 16 13:20:05 crc kubenswrapper[4757]: I1216 13:20:05.744188 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" event={"ID":"74f0d526-ef23-47fd-b475-6f799fd57ba5","Type":"ContainerDied","Data":"79ce89fae9778010d841810ccdaf9d87af2f2e3836f74f858f4fa16f9990bde6"}
Dec 16 13:20:06 crc kubenswrapper[4757]: I1216 13:20:06.097953 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"]
Dec 16 13:20:06 crc kubenswrapper[4757]: I1216 13:20:06.124274 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bxc8x"]
Dec 16 13:20:06 crc kubenswrapper[4757]: I1216 13:20:06.965582 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6fdfdb-145b-460e-b8e9-9f44e9034f40" path="/var/lib/kubelet/pods/8a6fdfdb-145b-460e-b8e9-9f44e9034f40/volumes"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.197080 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.262709 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-inventory\") pod \"74f0d526-ef23-47fd-b475-6f799fd57ba5\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") "
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.263016 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-ssh-key\") pod \"74f0d526-ef23-47fd-b475-6f799fd57ba5\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") "
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.263095 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gglzh\" (UniqueName: \"kubernetes.io/projected/74f0d526-ef23-47fd-b475-6f799fd57ba5-kube-api-access-gglzh\") pod \"74f0d526-ef23-47fd-b475-6f799fd57ba5\" (UID: \"74f0d526-ef23-47fd-b475-6f799fd57ba5\") "
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.270501 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f0d526-ef23-47fd-b475-6f799fd57ba5-kube-api-access-gglzh" (OuterVolumeSpecName: "kube-api-access-gglzh") pod "74f0d526-ef23-47fd-b475-6f799fd57ba5" (UID: "74f0d526-ef23-47fd-b475-6f799fd57ba5"). InnerVolumeSpecName "kube-api-access-gglzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.290153 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-inventory" (OuterVolumeSpecName: "inventory") pod "74f0d526-ef23-47fd-b475-6f799fd57ba5" (UID: "74f0d526-ef23-47fd-b475-6f799fd57ba5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.290488 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74f0d526-ef23-47fd-b475-6f799fd57ba5" (UID: "74f0d526-ef23-47fd-b475-6f799fd57ba5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.364779 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.364822 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gglzh\" (UniqueName: \"kubernetes.io/projected/74f0d526-ef23-47fd-b475-6f799fd57ba5-kube-api-access-gglzh\") on node \"crc\" DevicePath \"\"" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.364836 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f0d526-ef23-47fd-b475-6f799fd57ba5-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.761950 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" event={"ID":"74f0d526-ef23-47fd-b475-6f799fd57ba5","Type":"ContainerDied","Data":"3c0c1ecb94c84d905ec1eb3034dc766f28e55bbea94ffd9b112047de2c8cb702"} Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.761995 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c0c1ecb94c84d905ec1eb3034dc766f28e55bbea94ffd9b112047de2c8cb702" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.762070 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.900722 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"] Dec 16 13:20:07 crc kubenswrapper[4757]: E1216 13:20:07.901351 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f0d526-ef23-47fd-b475-6f799fd57ba5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.901461 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f0d526-ef23-47fd-b475-6f799fd57ba5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.901682 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f0d526-ef23-47fd-b475-6f799fd57ba5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.902369 4757 util.go:30] "No sandbox for pod can be found. 
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.904451 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.905963 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.906578 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.922567 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.924070 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"]
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.978238 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.980191 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfts\" (UniqueName: \"kubernetes.io/projected/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-kube-api-access-shfts\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"
Dec 16 13:20:07 crc kubenswrapper[4757]: I1216 13:20:07.980672 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"
Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.082334 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfts\" (UniqueName: \"kubernetes.io/projected/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-kube-api-access-shfts\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"
Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.082450 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"
Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.082525 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.090758 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.093417 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.099656 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfts\" (UniqueName: \"kubernetes.io/projected/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-kube-api-access-shfts\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kth25\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.223571 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.777869 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25"] Dec 16 13:20:08 crc kubenswrapper[4757]: I1216 13:20:08.781547 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:20:09 crc kubenswrapper[4757]: I1216 13:20:09.785594 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" event={"ID":"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e","Type":"ContainerStarted","Data":"7d717898c902f6f28862e240d8661f8a9f354c0537f4303417a49f20e6e1962d"} Dec 16 13:20:09 crc kubenswrapper[4757]: I1216 13:20:09.785922 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" event={"ID":"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e","Type":"ContainerStarted","Data":"2cccfefb6af36116bdb20e109275af2e29e6aa1414656cfceb57b12c1f881c75"} Dec 16 13:20:09 crc kubenswrapper[4757]: I1216 13:20:09.816426 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" podStartSLOduration=2.3366361319999998 podStartE2EDuration="2.816408138s" podCreationTimestamp="2025-12-16 13:20:07 +0000 UTC" firstStartedPulling="2025-12-16 13:20:08.780845014 +0000 UTC m=+1994.208588810" lastFinishedPulling="2025-12-16 13:20:09.26061702 +0000 UTC m=+1994.688360816" observedRunningTime="2025-12-16 13:20:09.810477919 +0000 UTC m=+1995.238221725" watchObservedRunningTime="2025-12-16 13:20:09.816408138 +0000 UTC m=+1995.244151934" Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.181360 4757 patch_prober.go:28] interesting 
pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.181793 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.181832 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.182485 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eca1d98db7f374bb43377a0b9dbe9ada84f08b48c42f15512ead147338cbbe40"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.182536 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://eca1d98db7f374bb43377a0b9dbe9ada84f08b48c42f15512ead147338cbbe40" gracePeriod=600 Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.886459 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="eca1d98db7f374bb43377a0b9dbe9ada84f08b48c42f15512ead147338cbbe40" exitCode=0 Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.886659 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"eca1d98db7f374bb43377a0b9dbe9ada84f08b48c42f15512ead147338cbbe40"} Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.887028 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf"} Dec 16 13:20:21 crc kubenswrapper[4757]: I1216 13:20:21.887050 4757 scope.go:117] "RemoveContainer" containerID="ac2941f3066a5cd8e1a4fb484e299750f27fdd9686bf1c41531159abf0a500d3" Dec 16 13:20:30 crc kubenswrapper[4757]: I1216 13:20:30.248748 4757 scope.go:117] "RemoveContainer" containerID="4c2346231f5aa5a3f75291af9b526f1c18d54ecfa10f7c241ed4613de9179a22" Dec 16 13:21:12 crc kubenswrapper[4757]: I1216 13:21:12.049380 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-b4m4q"] Dec 16 13:21:12 crc kubenswrapper[4757]: I1216 13:21:12.058051 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-b4m4q"] Dec 16 13:21:12 crc kubenswrapper[4757]: I1216 13:21:12.962879 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ea5e67-160d-47fd-9bb3-70141a4bcdb1" path="/var/lib/kubelet/pods/32ea5e67-160d-47fd-9bb3-70141a4bcdb1/volumes" Dec 16 
13:21:13 crc kubenswrapper[4757]: I1216 13:21:13.031416 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwttf"] Dec 16 13:21:13 crc kubenswrapper[4757]: I1216 13:21:13.041311 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hwttf"] Dec 16 13:21:14 crc kubenswrapper[4757]: I1216 13:21:14.963957 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6b1bba-b68a-4912-aada-0229a7152426" path="/var/lib/kubelet/pods/1c6b1bba-b68a-4912-aada-0229a7152426/volumes" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.162112 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnthc"] Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.165057 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.173648 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnthc"] Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.303249 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-utilities\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.303299 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-catalog-content\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.303346 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mphs2\" (UniqueName: \"kubernetes.io/projected/66b3b224-faae-4ab0-af23-38fd813ab25c-kube-api-access-mphs2\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.405495 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-utilities\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.405541 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-catalog-content\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.405614 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mphs2\" (UniqueName: \"kubernetes.io/projected/66b3b224-faae-4ab0-af23-38fd813ab25c-kube-api-access-mphs2\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: 
I1216 13:21:15.406313 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-utilities\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.406540 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-catalog-content\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.458396 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mphs2\" (UniqueName: \"kubernetes.io/projected/66b3b224-faae-4ab0-af23-38fd813ab25c-kube-api-access-mphs2\") pod \"redhat-operators-nnthc\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:15 crc kubenswrapper[4757]: I1216 13:21:15.498551 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:16 crc kubenswrapper[4757]: I1216 13:21:16.259197 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnthc"] Dec 16 13:21:16 crc kubenswrapper[4757]: I1216 13:21:16.331253 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerStarted","Data":"50f78f2648361edd916fd4427bb50786f98045427708775d0f6c36e6aed80792"} Dec 16 13:21:17 crc kubenswrapper[4757]: I1216 13:21:17.341662 4757 generic.go:334] "Generic (PLEG): container finished" podID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerID="bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4" exitCode=0 Dec 16 13:21:17 crc kubenswrapper[4757]: I1216 13:21:17.341741 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerDied","Data":"bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4"} Dec 16 13:21:18 crc kubenswrapper[4757]: I1216 13:21:18.356609 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerStarted","Data":"b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703"} Dec 16 13:21:22 crc kubenswrapper[4757]: I1216 13:21:22.390630 4757 generic.go:334] "Generic (PLEG): container finished" podID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerID="b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703" exitCode=0 Dec 16 13:21:22 crc kubenswrapper[4757]: I1216 13:21:22.390699 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerDied","Data":"b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703"} Dec 16 13:21:24 crc kubenswrapper[4757]: I1216 13:21:24.433850 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" 
event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerStarted","Data":"1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d"} Dec 16 13:21:24 crc kubenswrapper[4757]: I1216 13:21:24.467172 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnthc" podStartSLOduration=3.498321908 podStartE2EDuration="9.467150908s" podCreationTimestamp="2025-12-16 13:21:15 +0000 UTC" firstStartedPulling="2025-12-16 13:21:17.343742404 +0000 UTC m=+2062.771486200" lastFinishedPulling="2025-12-16 13:21:23.312571404 +0000 UTC m=+2068.740315200" observedRunningTime="2025-12-16 13:21:24.4644626 +0000 UTC m=+2069.892206406" watchObservedRunningTime="2025-12-16 13:21:24.467150908 +0000 UTC m=+2069.894894714" Dec 16 13:21:25 crc kubenswrapper[4757]: I1216 13:21:25.499946 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:25 crc kubenswrapper[4757]: I1216 13:21:25.500293 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:26 crc kubenswrapper[4757]: I1216 13:21:26.543581 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nnthc" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="registry-server" probeResult="failure" output=< Dec 16 13:21:26 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s Dec 16 13:21:26 crc kubenswrapper[4757]: > Dec 16 13:21:30 crc kubenswrapper[4757]: I1216 13:21:30.384033 4757 scope.go:117] "RemoveContainer" containerID="bb5bac903cac19e8af0a42238e02f346db20cfc7fc52e6187c10601921713636" Dec 16 13:21:30 crc kubenswrapper[4757]: I1216 13:21:30.426188 4757 scope.go:117] "RemoveContainer" containerID="3de9e060dc0c56b825fb723b67ad28f9320965646c9a8d7934c870822718e4e7" Dec 16 13:21:35 crc kubenswrapper[4757]: I1216 13:21:35.556410 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:35 crc kubenswrapper[4757]: I1216 13:21:35.606235 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:35 crc kubenswrapper[4757]: I1216 13:21:35.807870 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnthc"] Dec 16 13:21:37 crc kubenswrapper[4757]: I1216 13:21:37.557440 4757 generic.go:334] "Generic (PLEG): container finished" podID="0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" containerID="7d717898c902f6f28862e240d8661f8a9f354c0537f4303417a49f20e6e1962d" exitCode=0 Dec 16 13:21:37 crc kubenswrapper[4757]: I1216 13:21:37.557500 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" event={"ID":"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e","Type":"ContainerDied","Data":"7d717898c902f6f28862e240d8661f8a9f354c0537f4303417a49f20e6e1962d"} Dec 16 13:21:37 crc kubenswrapper[4757]: I1216 13:21:37.557864 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnthc" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="registry-server" containerID="cri-o://1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d" gracePeriod=2 Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.043964 4757 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.165707 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-catalog-content\") pod \"66b3b224-faae-4ab0-af23-38fd813ab25c\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.165778 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mphs2\" (UniqueName: \"kubernetes.io/projected/66b3b224-faae-4ab0-af23-38fd813ab25c-kube-api-access-mphs2\") pod \"66b3b224-faae-4ab0-af23-38fd813ab25c\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.165846 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-utilities\") pod \"66b3b224-faae-4ab0-af23-38fd813ab25c\" (UID: \"66b3b224-faae-4ab0-af23-38fd813ab25c\") " Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.166493 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-utilities" (OuterVolumeSpecName: "utilities") pod "66b3b224-faae-4ab0-af23-38fd813ab25c" (UID: "66b3b224-faae-4ab0-af23-38fd813ab25c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.172268 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b3b224-faae-4ab0-af23-38fd813ab25c-kube-api-access-mphs2" (OuterVolumeSpecName: "kube-api-access-mphs2") pod "66b3b224-faae-4ab0-af23-38fd813ab25c" (UID: "66b3b224-faae-4ab0-af23-38fd813ab25c"). InnerVolumeSpecName "kube-api-access-mphs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.267827 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mphs2\" (UniqueName: \"kubernetes.io/projected/66b3b224-faae-4ab0-af23-38fd813ab25c-kube-api-access-mphs2\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.267872 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.299309 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66b3b224-faae-4ab0-af23-38fd813ab25c" (UID: "66b3b224-faae-4ab0-af23-38fd813ab25c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.369256 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b3b224-faae-4ab0-af23-38fd813ab25c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.567843 4757 generic.go:334] "Generic (PLEG): container finished" podID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerID="1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d" exitCode=0 Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.567939 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerDied","Data":"1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d"} Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.567989 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnthc" event={"ID":"66b3b224-faae-4ab0-af23-38fd813ab25c","Type":"ContainerDied","Data":"50f78f2648361edd916fd4427bb50786f98045427708775d0f6c36e6aed80792"} Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.568104 4757 scope.go:117] "RemoveContainer" containerID="1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.569092 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnthc" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.609780 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnthc"] Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.618861 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnthc"] Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.619908 4757 scope.go:117] "RemoveContainer" containerID="b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.655369 4757 scope.go:117] "RemoveContainer" containerID="bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.695121 4757 scope.go:117] "RemoveContainer" containerID="1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d" Dec 16 13:21:38 crc kubenswrapper[4757]: E1216 13:21:38.696911 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d\": container with ID starting with 1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d not found: ID does not exist" containerID="1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.697134 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d"} err="failed to get container status \"1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d\": rpc error: code = NotFound desc = could not find container \"1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d\": container with ID starting with 1cfd6f061362606dc6338d05f458c56c18ed88c5d9790aaa66965c94c4a5e47d not found: ID does not exist" Dec 16 13:21:38 crc 
kubenswrapper[4757]: I1216 13:21:38.697190 4757 scope.go:117] "RemoveContainer" containerID="b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703" Dec 16 13:21:38 crc kubenswrapper[4757]: E1216 13:21:38.697873 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703\": container with ID starting with b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703 not found: ID does not exist" containerID="b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.697897 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703"} err="failed to get container status \"b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703\": rpc error: code = NotFound desc = could not find container \"b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703\": container with ID starting with b20486bb0ae4b796e962146c7a7042932ce04233588363a381f5614faad85703 not found: ID does not exist" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.697916 4757 scope.go:117] "RemoveContainer" containerID="bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4" Dec 16 13:21:38 crc kubenswrapper[4757]: E1216 13:21:38.698256 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4\": container with ID starting with bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4 not found: ID does not exist" containerID="bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.698294 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4"} err="failed to get container status \"bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4\": rpc error: code = NotFound desc = could not find container \"bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4\": container with ID starting with bd8b1488f5b9a390d378840a204ae2ccadc58d0fd3a8000cceaf5cce88f337b4 not found: ID does not exist" Dec 16 13:21:38 crc kubenswrapper[4757]: I1216 13:21:38.960048 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" path="/var/lib/kubelet/pods/66b3b224-faae-4ab0-af23-38fd813ab25c/volumes" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.070139 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.190720 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-inventory\") pod \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.190902 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfts\" (UniqueName: \"kubernetes.io/projected/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-kube-api-access-shfts\") pod \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.190934 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-ssh-key\") pod \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\" (UID: \"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e\") " Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.200059 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-kube-api-access-shfts" (OuterVolumeSpecName: "kube-api-access-shfts") pod "0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" (UID: "0b04c40b-fcee-4a0c-b5d0-c994f3fd138e"). InnerVolumeSpecName "kube-api-access-shfts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.218169 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-inventory" (OuterVolumeSpecName: "inventory") pod "0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" (UID: "0b04c40b-fcee-4a0c-b5d0-c994f3fd138e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.218801 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" (UID: "0b04c40b-fcee-4a0c-b5d0-c994f3fd138e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.293531 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.293580 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfts\" (UniqueName: \"kubernetes.io/projected/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-kube-api-access-shfts\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.293597 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b04c40b-fcee-4a0c-b5d0-c994f3fd138e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.579548 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" event={"ID":"0b04c40b-fcee-4a0c-b5d0-c994f3fd138e","Type":"ContainerDied","Data":"2cccfefb6af36116bdb20e109275af2e29e6aa1414656cfceb57b12c1f881c75"} Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.579604 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cccfefb6af36116bdb20e109275af2e29e6aa1414656cfceb57b12c1f881c75" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.580325 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kth25" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.674432 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n"] Dec 16 13:21:39 crc kubenswrapper[4757]: E1216 13:21:39.676073 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="extract-content" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.676162 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="extract-content" Dec 16 13:21:39 crc kubenswrapper[4757]: E1216 13:21:39.676279 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.676424 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 13:21:39 crc kubenswrapper[4757]: E1216 13:21:39.676512 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="registry-server" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.676578 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="registry-server" Dec 16 13:21:39 crc kubenswrapper[4757]: E1216 13:21:39.676659 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="extract-utilities" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.676730 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="extract-utilities" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.676946 4757 
memory_manager.go:354] "RemoveStaleState removing state" podUID="66b3b224-faae-4ab0-af23-38fd813ab25c" containerName="registry-server" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.677076 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b04c40b-fcee-4a0c-b5d0-c994f3fd138e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.677777 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.688742 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.688967 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.689197 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.688920 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.689794 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n"] Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.802156 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.802381 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcp8\" (UniqueName: \"kubernetes.io/projected/948d5531-d301-46c5-ac1a-882ceee8df96-kube-api-access-hfcp8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.802701 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.905225 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.905342 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.905365 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcp8\" (UniqueName: \"kubernetes.io/projected/948d5531-d301-46c5-ac1a-882ceee8df96-kube-api-access-hfcp8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.910931 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.910929 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:39 crc kubenswrapper[4757]: I1216 13:21:39.923247 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcp8\" (UniqueName: \"kubernetes.io/projected/948d5531-d301-46c5-ac1a-882ceee8df96-kube-api-access-hfcp8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nht9n\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:40 crc kubenswrapper[4757]: I1216 13:21:40.005802 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:40 crc kubenswrapper[4757]: I1216 13:21:40.436825 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n"] Dec 16 13:21:40 crc kubenswrapper[4757]: W1216 13:21:40.439763 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod948d5531_d301_46c5_ac1a_882ceee8df96.slice/crio-55a6e79a7ad4d145cc68e6d1ce0b1cf6f793efca5f6e9a74f5e2e8dec106d4ce WatchSource:0}: Error finding container 55a6e79a7ad4d145cc68e6d1ce0b1cf6f793efca5f6e9a74f5e2e8dec106d4ce: Status 404 returned error can't find the container with id 55a6e79a7ad4d145cc68e6d1ce0b1cf6f793efca5f6e9a74f5e2e8dec106d4ce Dec 16 13:21:40 crc kubenswrapper[4757]: I1216 13:21:40.590104 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" event={"ID":"948d5531-d301-46c5-ac1a-882ceee8df96","Type":"ContainerStarted","Data":"55a6e79a7ad4d145cc68e6d1ce0b1cf6f793efca5f6e9a74f5e2e8dec106d4ce"} Dec 16 13:21:41 crc kubenswrapper[4757]: I1216 13:21:41.607158 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" event={"ID":"948d5531-d301-46c5-ac1a-882ceee8df96","Type":"ContainerStarted","Data":"de83907b57564e6b43c94edfeb5b6313602f8041450283954374b3e15279b57f"} Dec 16 13:21:41 crc kubenswrapper[4757]: I1216 13:21:41.634503 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" podStartSLOduration=1.8723147519999999 podStartE2EDuration="2.634482981s" podCreationTimestamp="2025-12-16 13:21:39 +0000 UTC" firstStartedPulling="2025-12-16 13:21:40.441760142 +0000 UTC m=+2085.869503938" lastFinishedPulling="2025-12-16 13:21:41.203928371 +0000 UTC m=+2086.631672167" observedRunningTime="2025-12-16 13:21:41.623173525 +0000 UTC m=+2087.050917321" watchObservedRunningTime="2025-12-16 13:21:41.634482981 +0000 UTC m=+2087.062226777" Dec 16 13:21:47 crc kubenswrapper[4757]: I1216 13:21:47.655270 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" event={"ID":"948d5531-d301-46c5-ac1a-882ceee8df96","Type":"ContainerDied","Data":"de83907b57564e6b43c94edfeb5b6313602f8041450283954374b3e15279b57f"} Dec 16 13:21:47 crc kubenswrapper[4757]: I1216 13:21:47.655212 4757 generic.go:334] "Generic (PLEG): container finished" podID="948d5531-d301-46c5-ac1a-882ceee8df96" containerID="de83907b57564e6b43c94edfeb5b6313602f8041450283954374b3e15279b57f" exitCode=0 Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.099795 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.211193 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-ssh-key\") pod \"948d5531-d301-46c5-ac1a-882ceee8df96\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.211490 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfcp8\" (UniqueName: \"kubernetes.io/projected/948d5531-d301-46c5-ac1a-882ceee8df96-kube-api-access-hfcp8\") pod \"948d5531-d301-46c5-ac1a-882ceee8df96\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.211570 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-inventory\") pod \"948d5531-d301-46c5-ac1a-882ceee8df96\" (UID: \"948d5531-d301-46c5-ac1a-882ceee8df96\") " Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.218569 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948d5531-d301-46c5-ac1a-882ceee8df96-kube-api-access-hfcp8" (OuterVolumeSpecName: "kube-api-access-hfcp8") pod "948d5531-d301-46c5-ac1a-882ceee8df96" (UID: "948d5531-d301-46c5-ac1a-882ceee8df96"). InnerVolumeSpecName "kube-api-access-hfcp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.241692 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "948d5531-d301-46c5-ac1a-882ceee8df96" (UID: "948d5531-d301-46c5-ac1a-882ceee8df96"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.250358 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-inventory" (OuterVolumeSpecName: "inventory") pod "948d5531-d301-46c5-ac1a-882ceee8df96" (UID: "948d5531-d301-46c5-ac1a-882ceee8df96"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.313927 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.313980 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/948d5531-d301-46c5-ac1a-882ceee8df96-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.313994 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfcp8\" (UniqueName: \"kubernetes.io/projected/948d5531-d301-46c5-ac1a-882ceee8df96-kube-api-access-hfcp8\") on node \"crc\" DevicePath \"\"" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.677064 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" event={"ID":"948d5531-d301-46c5-ac1a-882ceee8df96","Type":"ContainerDied","Data":"55a6e79a7ad4d145cc68e6d1ce0b1cf6f793efca5f6e9a74f5e2e8dec106d4ce"} Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.677324 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55a6e79a7ad4d145cc68e6d1ce0b1cf6f793efca5f6e9a74f5e2e8dec106d4ce" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.677235 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nht9n" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.765764 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k"] Dec 16 13:21:49 crc kubenswrapper[4757]: E1216 13:21:49.766134 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948d5531-d301-46c5-ac1a-882ceee8df96" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.766151 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="948d5531-d301-46c5-ac1a-882ceee8df96" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.766374 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="948d5531-d301-46c5-ac1a-882ceee8df96" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.766996 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.772092 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.772327 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.772674 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.772859 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.787154 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k"] Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.924711 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.925150 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vssd8\" (UniqueName: \"kubernetes.io/projected/cd87efc3-653f-4794-89b8-490ea0b504dd-kube-api-access-vssd8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:49 crc kubenswrapper[4757]: I1216 13:21:49.925282 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.027182 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vssd8\" (UniqueName: \"kubernetes.io/projected/cd87efc3-653f-4794-89b8-490ea0b504dd-kube-api-access-vssd8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.027577 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.027772 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: 
\"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.032189 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.032765 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.060706 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vssd8\" (UniqueName: \"kubernetes.io/projected/cd87efc3-653f-4794-89b8-490ea0b504dd-kube-api-access-vssd8\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-6fz7k\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.095950 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.614777 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k"] Dec 16 13:21:50 crc kubenswrapper[4757]: I1216 13:21:50.686124 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" event={"ID":"cd87efc3-653f-4794-89b8-490ea0b504dd","Type":"ContainerStarted","Data":"8d2b9586b66501ca3d174c8d792ffc9ef340254b69a200e1337918ced7b192cc"} Dec 16 13:21:51 crc kubenswrapper[4757]: I1216 13:21:51.696200 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" event={"ID":"cd87efc3-653f-4794-89b8-490ea0b504dd","Type":"ContainerStarted","Data":"722d4ed2518fca15509e3ad7798d3d0b7142dec33a5140057b91abf1105b530f"} Dec 16 13:21:51 crc kubenswrapper[4757]: I1216 13:21:51.716235 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" podStartSLOduration=1.990232013 podStartE2EDuration="2.716211467s" podCreationTimestamp="2025-12-16 13:21:49 +0000 UTC" firstStartedPulling="2025-12-16 13:21:50.637425019 +0000 UTC m=+2096.065168815" lastFinishedPulling="2025-12-16 13:21:51.363404473 +0000 UTC m=+2096.791148269" observedRunningTime="2025-12-16 13:21:51.712898814 +0000 UTC m=+2097.140642620" watchObservedRunningTime="2025-12-16 13:21:51.716211467 +0000 UTC m=+2097.143955263" Dec 16 13:21:53 crc kubenswrapper[4757]: I1216 13:21:53.041338 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkdzh"] Dec 16 13:21:53 crc kubenswrapper[4757]: I1216 13:21:53.048539 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vkdzh"] Dec 16 13:21:54 crc kubenswrapper[4757]: I1216 13:21:54.964373 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="849962c6-8103-4d96-8136-23acb6221049" path="/var/lib/kubelet/pods/849962c6-8103-4d96-8136-23acb6221049/volumes" Dec 16 13:22:21 crc kubenswrapper[4757]: I1216 13:22:21.181939 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:22:21 crc kubenswrapper[4757]: I1216 13:22:21.182582 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:22:30 crc kubenswrapper[4757]: I1216 13:22:30.540253 4757 scope.go:117] "RemoveContainer" containerID="7615e52cc5d43f93c458b907c7cd86fadf052d622b2f992133ab915df9bd2a88" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.697864 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mzgn6"] Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.701958 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.707430 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzgn6"] Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.886612 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bb6t\" (UniqueName: \"kubernetes.io/projected/236d39a0-f6b5-4e1a-beeb-d48a92648745-kube-api-access-8bb6t\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.886920 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-utilities\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.886959 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-catalog-content\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.988783 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bb6t\" (UniqueName: \"kubernetes.io/projected/236d39a0-f6b5-4e1a-beeb-d48a92648745-kube-api-access-8bb6t\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.989399 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-utilities\") pod \"certified-operators-mzgn6\" (UID: 
\"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.989442 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-catalog-content\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.990238 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-catalog-content\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:33 crc kubenswrapper[4757]: I1216 13:22:33.990589 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-utilities\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:34 crc kubenswrapper[4757]: I1216 13:22:34.014241 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bb6t\" (UniqueName: \"kubernetes.io/projected/236d39a0-f6b5-4e1a-beeb-d48a92648745-kube-api-access-8bb6t\") pod \"certified-operators-mzgn6\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:34 crc kubenswrapper[4757]: I1216 13:22:34.024912 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:34 crc kubenswrapper[4757]: I1216 13:22:34.573826 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzgn6"] Dec 16 13:22:35 crc kubenswrapper[4757]: I1216 13:22:35.051611 4757 generic.go:334] "Generic (PLEG): container finished" podID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerID="9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7" exitCode=0 Dec 16 13:22:35 crc kubenswrapper[4757]: I1216 13:22:35.051693 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerDied","Data":"9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7"} Dec 16 13:22:35 crc kubenswrapper[4757]: I1216 13:22:35.053847 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerStarted","Data":"d01358b4a754eb2c52ee70fa1c2e25fcbb8ce5df18be93e4e0e0cc7d404c648a"} Dec 16 13:22:36 crc kubenswrapper[4757]: I1216 13:22:36.074152 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerStarted","Data":"784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9"} Dec 16 13:22:36 crc kubenswrapper[4757]: I1216 13:22:36.078582 4757 generic.go:334] "Generic (PLEG): container finished" podID="cd87efc3-653f-4794-89b8-490ea0b504dd" containerID="722d4ed2518fca15509e3ad7798d3d0b7142dec33a5140057b91abf1105b530f" exitCode=0 Dec 16 13:22:36 crc kubenswrapper[4757]: 
I1216 13:22:36.078693 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" event={"ID":"cd87efc3-653f-4794-89b8-490ea0b504dd","Type":"ContainerDied","Data":"722d4ed2518fca15509e3ad7798d3d0b7142dec33a5140057b91abf1105b530f"} Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.087687 4757 generic.go:334] "Generic (PLEG): container finished" podID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerID="784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9" exitCode=0 Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.088969 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerDied","Data":"784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9"} Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.482216 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.581131 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-ssh-key\") pod \"cd87efc3-653f-4794-89b8-490ea0b504dd\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.581340 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vssd8\" (UniqueName: \"kubernetes.io/projected/cd87efc3-653f-4794-89b8-490ea0b504dd-kube-api-access-vssd8\") pod \"cd87efc3-653f-4794-89b8-490ea0b504dd\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.581384 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-inventory\") pod \"cd87efc3-653f-4794-89b8-490ea0b504dd\" (UID: \"cd87efc3-653f-4794-89b8-490ea0b504dd\") " Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.590713 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd87efc3-653f-4794-89b8-490ea0b504dd-kube-api-access-vssd8" (OuterVolumeSpecName: "kube-api-access-vssd8") pod "cd87efc3-653f-4794-89b8-490ea0b504dd" (UID: "cd87efc3-653f-4794-89b8-490ea0b504dd"). InnerVolumeSpecName "kube-api-access-vssd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.614713 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd87efc3-653f-4794-89b8-490ea0b504dd" (UID: "cd87efc3-653f-4794-89b8-490ea0b504dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.620432 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-inventory" (OuterVolumeSpecName: "inventory") pod "cd87efc3-653f-4794-89b8-490ea0b504dd" (UID: "cd87efc3-653f-4794-89b8-490ea0b504dd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.683196 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vssd8\" (UniqueName: \"kubernetes.io/projected/cd87efc3-653f-4794-89b8-490ea0b504dd-kube-api-access-vssd8\") on node \"crc\" DevicePath \"\"" Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.683235 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:22:37 crc kubenswrapper[4757]: I1216 13:22:37.683245 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd87efc3-653f-4794-89b8-490ea0b504dd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.095599 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" event={"ID":"cd87efc3-653f-4794-89b8-490ea0b504dd","Type":"ContainerDied","Data":"8d2b9586b66501ca3d174c8d792ffc9ef340254b69a200e1337918ced7b192cc"} Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.095965 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d2b9586b66501ca3d174c8d792ffc9ef340254b69a200e1337918ced7b192cc" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.095639 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-6fz7k" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.233718 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752"] Dec 16 13:22:38 crc kubenswrapper[4757]: E1216 13:22:38.234271 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd87efc3-653f-4794-89b8-490ea0b504dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.234300 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd87efc3-653f-4794-89b8-490ea0b504dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.234560 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd87efc3-653f-4794-89b8-490ea0b504dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.235339 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.237690 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.246669 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752"] Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.266087 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.267704 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.267844 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.405804 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.406271 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rch9t\" (UniqueName: \"kubernetes.io/projected/ec2b71fe-44a0-4fae-b631-f719f7d735a5-kube-api-access-rch9t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.406381 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.507852 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.507938 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rch9t\" (UniqueName: \"kubernetes.io/projected/ec2b71fe-44a0-4fae-b631-f719f7d735a5-kube-api-access-rch9t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.508052 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" 
(UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.513349 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.513568 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.538707 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rch9t\" (UniqueName: \"kubernetes.io/projected/ec2b71fe-44a0-4fae-b631-f719f7d735a5-kube-api-access-rch9t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fc752\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:38 crc kubenswrapper[4757]: I1216 13:22:38.564273 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:22:39 crc kubenswrapper[4757]: I1216 13:22:39.108898 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerStarted","Data":"92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198"} Dec 16 13:22:39 crc kubenswrapper[4757]: I1216 13:22:39.133170 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mzgn6" podStartSLOduration=3.404101097 podStartE2EDuration="6.133149597s" podCreationTimestamp="2025-12-16 13:22:33 +0000 UTC" firstStartedPulling="2025-12-16 13:22:35.053475733 +0000 UTC m=+2140.481219529" lastFinishedPulling="2025-12-16 13:22:37.782524233 +0000 UTC m=+2143.210268029" observedRunningTime="2025-12-16 13:22:39.132727286 +0000 UTC m=+2144.560471082" watchObservedRunningTime="2025-12-16 13:22:39.133149597 +0000 UTC m=+2144.560893403" Dec 16 13:22:39 crc kubenswrapper[4757]: I1216 13:22:39.184921 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752"] Dec 16 13:22:39 crc kubenswrapper[4757]: W1216 13:22:39.187455 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec2b71fe_44a0_4fae_b631_f719f7d735a5.slice/crio-2dbd082a1aadff62de086acc90e41de06279bc35d9db4d704148438ddf553cc6 WatchSource:0}: Error finding container 2dbd082a1aadff62de086acc90e41de06279bc35d9db4d704148438ddf553cc6: Status 404 returned error can't find the container with id 2dbd082a1aadff62de086acc90e41de06279bc35d9db4d704148438ddf553cc6 Dec 16 13:22:40 crc kubenswrapper[4757]: I1216 13:22:40.118844 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" 
event={"ID":"ec2b71fe-44a0-4fae-b631-f719f7d735a5","Type":"ContainerStarted","Data":"2dbd082a1aadff62de086acc90e41de06279bc35d9db4d704148438ddf553cc6"} Dec 16 13:22:41 crc kubenswrapper[4757]: I1216 13:22:41.135076 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" event={"ID":"ec2b71fe-44a0-4fae-b631-f719f7d735a5","Type":"ContainerStarted","Data":"af1f9a6d4f8716b947c44ec7c314a5800fc12128a70b6a6ce82d29f434d89b0c"} Dec 16 13:22:44 crc kubenswrapper[4757]: I1216 13:22:44.025510 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:44 crc kubenswrapper[4757]: I1216 13:22:44.026433 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:44 crc kubenswrapper[4757]: I1216 13:22:44.070474 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:44 crc kubenswrapper[4757]: I1216 13:22:44.095644 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" podStartSLOduration=5.112172806 podStartE2EDuration="6.095625941s" podCreationTimestamp="2025-12-16 13:22:38 +0000 UTC" firstStartedPulling="2025-12-16 13:22:39.190453267 +0000 UTC m=+2144.618197063" lastFinishedPulling="2025-12-16 13:22:40.173906382 +0000 UTC m=+2145.601650198" observedRunningTime="2025-12-16 13:22:41.160310474 +0000 UTC m=+2146.588054270" watchObservedRunningTime="2025-12-16 13:22:44.095625941 +0000 UTC m=+2149.523369737" Dec 16 13:22:44 crc kubenswrapper[4757]: I1216 13:22:44.206754 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:44 crc kubenswrapper[4757]: I1216 13:22:44.315076 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzgn6"] Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.175956 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mzgn6" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="registry-server" containerID="cri-o://92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198" gracePeriod=2 Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.647297 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.818986 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bb6t\" (UniqueName: \"kubernetes.io/projected/236d39a0-f6b5-4e1a-beeb-d48a92648745-kube-api-access-8bb6t\") pod \"236d39a0-f6b5-4e1a-beeb-d48a92648745\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.819092 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-utilities\") pod \"236d39a0-f6b5-4e1a-beeb-d48a92648745\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.819363 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-catalog-content\") pod \"236d39a0-f6b5-4e1a-beeb-d48a92648745\" (UID: \"236d39a0-f6b5-4e1a-beeb-d48a92648745\") " Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.819863 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-utilities" (OuterVolumeSpecName: "utilities") pod "236d39a0-f6b5-4e1a-beeb-d48a92648745" (UID: "236d39a0-f6b5-4e1a-beeb-d48a92648745"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.820172 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.827324 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236d39a0-f6b5-4e1a-beeb-d48a92648745-kube-api-access-8bb6t" (OuterVolumeSpecName: "kube-api-access-8bb6t") pod "236d39a0-f6b5-4e1a-beeb-d48a92648745" (UID: "236d39a0-f6b5-4e1a-beeb-d48a92648745"). InnerVolumeSpecName "kube-api-access-8bb6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.869403 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "236d39a0-f6b5-4e1a-beeb-d48a92648745" (UID: "236d39a0-f6b5-4e1a-beeb-d48a92648745"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.921938 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bb6t\" (UniqueName: \"kubernetes.io/projected/236d39a0-f6b5-4e1a-beeb-d48a92648745-kube-api-access-8bb6t\") on node \"crc\" DevicePath \"\"" Dec 16 13:22:46 crc kubenswrapper[4757]: I1216 13:22:46.922259 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236d39a0-f6b5-4e1a-beeb-d48a92648745-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.186443 4757 generic.go:334] "Generic (PLEG): container finished" podID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerID="92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198" exitCode=0 Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.186555 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerDied","Data":"92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198"} Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.186630 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzgn6" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.187688 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzgn6" event={"ID":"236d39a0-f6b5-4e1a-beeb-d48a92648745","Type":"ContainerDied","Data":"d01358b4a754eb2c52ee70fa1c2e25fcbb8ce5df18be93e4e0e0cc7d404c648a"} Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.187790 4757 scope.go:117] "RemoveContainer" containerID="92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.240423 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzgn6"] Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.240890 4757 scope.go:117] "RemoveContainer" containerID="784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.250653 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mzgn6"] Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.259904 4757 scope.go:117] "RemoveContainer" containerID="9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.313670 4757 scope.go:117] "RemoveContainer" containerID="92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198" Dec 16 13:22:47 crc kubenswrapper[4757]: E1216 13:22:47.314194 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198\": container with ID starting with 92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198 not found: ID does not exist" containerID="92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.314249 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198"} err="failed to get container status 
\"92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198\": rpc error: code = NotFound desc = could not find container \"92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198\": container with ID starting with 92625825822b661e192b3bfcfc953365c9974015314b200322ffb313b917d198 not found: ID does not exist" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.314284 4757 scope.go:117] "RemoveContainer" containerID="784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9" Dec 16 13:22:47 crc kubenswrapper[4757]: E1216 13:22:47.314707 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9\": container with ID starting with 784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9 not found: ID does not exist" containerID="784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.314746 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9"} err="failed to get container status \"784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9\": rpc error: code = NotFound desc = could not find container \"784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9\": container with ID starting with 784c97cc4e88b946a7b94ed0c91c3a592dae503f6cee84ca84c613dab09f66f9 not found: ID does not exist" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.314774 4757 scope.go:117] "RemoveContainer" containerID="9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7" Dec 16 13:22:47 crc kubenswrapper[4757]: E1216 13:22:47.315172 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7\": container with ID starting with 9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7 not found: ID does not exist" containerID="9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7" Dec 16 13:22:47 crc kubenswrapper[4757]: I1216 13:22:47.315203 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7"} err="failed to get container status \"9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7\": rpc error: code = NotFound desc = could not find container \"9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7\": container with ID starting with 9753f9d0985a9cafbcaa81a02c3605887c4cc74b94b8471902e2b1f9974efdc7 not found: ID does not exist" Dec 16 13:22:48 crc kubenswrapper[4757]: I1216 13:22:48.959421 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" path="/var/lib/kubelet/pods/236d39a0-f6b5-4e1a-beeb-d48a92648745/volumes" Dec 16 13:22:51 crc kubenswrapper[4757]: I1216 13:22:51.181479 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:22:51 crc kubenswrapper[4757]: I1216 13:22:51.181550 4757 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.079255 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2lm6t"] Dec 16 13:23:15 crc kubenswrapper[4757]: E1216 13:23:15.080339 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="extract-content" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.080359 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="extract-content" Dec 16 13:23:15 crc kubenswrapper[4757]: E1216 13:23:15.080379 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="registry-server" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.080387 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="registry-server" Dec 16 13:23:15 crc kubenswrapper[4757]: E1216 13:23:15.080401 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="extract-utilities" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.080410 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="extract-utilities" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.080625 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="236d39a0-f6b5-4e1a-beeb-d48a92648745" containerName="registry-server" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.082354 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.100723 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lm6t"] Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.206165 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-utilities\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.206269 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2ksv\" (UniqueName: \"kubernetes.io/projected/d9f4adb8-27d0-4a47-a24f-e7339822318c-kube-api-access-z2ksv\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.206314 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-catalog-content\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.319817 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-utilities\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.319889 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2ksv\" (UniqueName: \"kubernetes.io/projected/d9f4adb8-27d0-4a47-a24f-e7339822318c-kube-api-access-z2ksv\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.319919 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-catalog-content\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.320383 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-utilities\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.320440 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-catalog-content\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.351046 4757 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z2ksv\" (UniqueName: \"kubernetes.io/projected/d9f4adb8-27d0-4a47-a24f-e7339822318c-kube-api-access-z2ksv\") pod \"redhat-marketplace-2lm6t\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.405578 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:15 crc kubenswrapper[4757]: I1216 13:23:15.911523 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lm6t"] Dec 16 13:23:15 crc kubenswrapper[4757]: W1216 13:23:15.924107 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f4adb8_27d0_4a47_a24f_e7339822318c.slice/crio-6fb10869bd4c56161cb149681059628171d5aad1a095c42fcba3c1d215c52543 WatchSource:0}: Error finding container 6fb10869bd4c56161cb149681059628171d5aad1a095c42fcba3c1d215c52543: Status 404 returned error can't find the container with id 6fb10869bd4c56161cb149681059628171d5aad1a095c42fcba3c1d215c52543 Dec 16 13:23:16 crc kubenswrapper[4757]: I1216 13:23:16.462446 4757 generic.go:334] "Generic (PLEG): container finished" podID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerID="c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071" exitCode=0 Dec 16 13:23:16 crc kubenswrapper[4757]: I1216 13:23:16.462494 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerDied","Data":"c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071"} Dec 16 13:23:16 crc kubenswrapper[4757]: I1216 13:23:16.462525 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerStarted","Data":"6fb10869bd4c56161cb149681059628171d5aad1a095c42fcba3c1d215c52543"} Dec 16 13:23:17 crc kubenswrapper[4757]: I1216 13:23:17.474515 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerStarted","Data":"b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171"} Dec 16 13:23:18 crc kubenswrapper[4757]: I1216 13:23:18.485291 4757 generic.go:334] "Generic (PLEG): container finished" podID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerID="b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171" exitCode=0 Dec 16 13:23:18 crc kubenswrapper[4757]: I1216 13:23:18.485409 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerDied","Data":"b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171"} Dec 16 13:23:20 crc kubenswrapper[4757]: I1216 13:23:20.502859 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerStarted","Data":"c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420"} Dec 16 13:23:20 crc kubenswrapper[4757]: I1216 13:23:20.528525 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2lm6t" podStartSLOduration=2.163925809 
podStartE2EDuration="5.528502455s" podCreationTimestamp="2025-12-16 13:23:15 +0000 UTC" firstStartedPulling="2025-12-16 13:23:16.464549517 +0000 UTC m=+2181.892293313" lastFinishedPulling="2025-12-16 13:23:19.829126163 +0000 UTC m=+2185.256869959" observedRunningTime="2025-12-16 13:23:20.523343064 +0000 UTC m=+2185.951086850" watchObservedRunningTime="2025-12-16 13:23:20.528502455 +0000 UTC m=+2185.956246251" Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.181071 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.181366 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.181582 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.182373 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.182434 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" gracePeriod=600 Dec 16 13:23:21 crc kubenswrapper[4757]: E1216 13:23:21.321702 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.513891 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" exitCode=0 Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.513989 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf"} Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.514062 4757 scope.go:117] "RemoveContainer" containerID="eca1d98db7f374bb43377a0b9dbe9ada84f08b48c42f15512ead147338cbbe40" Dec 16 13:23:21 crc kubenswrapper[4757]: I1216 13:23:21.514912 4757 scope.go:117] 
"RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:23:21 crc kubenswrapper[4757]: E1216 13:23:21.515192 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:23:25 crc kubenswrapper[4757]: I1216 13:23:25.406409 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:25 crc kubenswrapper[4757]: I1216 13:23:25.407102 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:25 crc kubenswrapper[4757]: I1216 13:23:25.463089 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:25 crc kubenswrapper[4757]: I1216 13:23:25.605059 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:25 crc kubenswrapper[4757]: I1216 13:23:25.703311 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lm6t"] Dec 16 13:23:27 crc kubenswrapper[4757]: I1216 13:23:27.564507 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2lm6t" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="registry-server" containerID="cri-o://c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420" gracePeriod=2 Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.157537 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.360045 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-utilities\") pod \"d9f4adb8-27d0-4a47-a24f-e7339822318c\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.360220 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-catalog-content\") pod \"d9f4adb8-27d0-4a47-a24f-e7339822318c\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.360273 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2ksv\" (UniqueName: \"kubernetes.io/projected/d9f4adb8-27d0-4a47-a24f-e7339822318c-kube-api-access-z2ksv\") pod \"d9f4adb8-27d0-4a47-a24f-e7339822318c\" (UID: \"d9f4adb8-27d0-4a47-a24f-e7339822318c\") " Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.361212 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-utilities" (OuterVolumeSpecName: "utilities") pod "d9f4adb8-27d0-4a47-a24f-e7339822318c" (UID: "d9f4adb8-27d0-4a47-a24f-e7339822318c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.371937 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f4adb8-27d0-4a47-a24f-e7339822318c-kube-api-access-z2ksv" (OuterVolumeSpecName: "kube-api-access-z2ksv") pod "d9f4adb8-27d0-4a47-a24f-e7339822318c" (UID: "d9f4adb8-27d0-4a47-a24f-e7339822318c"). InnerVolumeSpecName "kube-api-access-z2ksv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.386583 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9f4adb8-27d0-4a47-a24f-e7339822318c" (UID: "d9f4adb8-27d0-4a47-a24f-e7339822318c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.462276 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.462629 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2ksv\" (UniqueName: \"kubernetes.io/projected/d9f4adb8-27d0-4a47-a24f-e7339822318c-kube-api-access-z2ksv\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.462705 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9f4adb8-27d0-4a47-a24f-e7339822318c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.575380 4757 generic.go:334] "Generic (PLEG): container finished" podID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerID="c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420" exitCode=0 Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.575478 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2lm6t" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.575489 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerDied","Data":"c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420"} Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.576299 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2lm6t" event={"ID":"d9f4adb8-27d0-4a47-a24f-e7339822318c","Type":"ContainerDied","Data":"6fb10869bd4c56161cb149681059628171d5aad1a095c42fcba3c1d215c52543"} Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.576324 4757 scope.go:117] "RemoveContainer" containerID="c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.609336 4757 scope.go:117] "RemoveContainer" containerID="b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.620545 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lm6t"] Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.628707 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2lm6t"] Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.638257 4757 scope.go:117] "RemoveContainer" containerID="c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.673699 4757 scope.go:117] "RemoveContainer" containerID="c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420" Dec 16 13:23:28 crc kubenswrapper[4757]: E1216 13:23:28.674145 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420\": container with ID starting with c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420 not found: ID does not exist" containerID="c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.674211 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420"} err="failed to get container status \"c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420\": rpc error: code = NotFound desc = could not find container \"c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420\": container with ID starting with c39de0da2a9c63d124161dc78c7b5da3d2180746899c00bcf8beea91fef1b420 not found: ID does not exist" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.674241 4757 scope.go:117] "RemoveContainer" containerID="b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171" Dec 16 13:23:28 crc kubenswrapper[4757]: E1216 13:23:28.674574 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171\": container with ID starting with b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171 not found: ID does not exist" containerID="b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.674617 4757 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171"} err="failed to get container status \"b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171\": rpc error: code = NotFound desc = could not find container \"b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171\": container with ID starting with b9c5d717287ee81e6c1502afd83c6e8665ecd3372d29a4a3b458212c81094171 not found: ID does not exist" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.674645 4757 scope.go:117] "RemoveContainer" containerID="c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071" Dec 16 13:23:28 crc kubenswrapper[4757]: E1216 13:23:28.674959 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071\": container with ID starting with c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071 not found: ID does not exist" containerID="c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.674986 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071"} err="failed to get container status \"c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071\": rpc error: code = NotFound desc = could not find container \"c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071\": container with ID starting with c8d669c20766bff947f5b1c6a30232f1a2f6893c3e594e7ef5e2bf34c70fa071 not found: ID does not exist" Dec 16 13:23:28 crc kubenswrapper[4757]: I1216 13:23:28.959532 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" path="/var/lib/kubelet/pods/d9f4adb8-27d0-4a47-a24f-e7339822318c/volumes" Dec 16 13:23:35 crc kubenswrapper[4757]: I1216 13:23:35.948695 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:23:35 crc kubenswrapper[4757]: E1216 13:23:35.949540 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:23:37 crc kubenswrapper[4757]: I1216 13:23:37.649821 4757 generic.go:334] "Generic (PLEG): container finished" podID="ec2b71fe-44a0-4fae-b631-f719f7d735a5" containerID="af1f9a6d4f8716b947c44ec7c314a5800fc12128a70b6a6ce82d29f434d89b0c" exitCode=0 Dec 16 13:23:37 crc kubenswrapper[4757]: I1216 13:23:37.650161 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" event={"ID":"ec2b71fe-44a0-4fae-b631-f719f7d735a5","Type":"ContainerDied","Data":"af1f9a6d4f8716b947c44ec7c314a5800fc12128a70b6a6ce82d29f434d89b0c"} Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.145608 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.261314 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rch9t\" (UniqueName: \"kubernetes.io/projected/ec2b71fe-44a0-4fae-b631-f719f7d735a5-kube-api-access-rch9t\") pod \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.261499 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-inventory\") pod \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.261598 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-ssh-key\") pod \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\" (UID: \"ec2b71fe-44a0-4fae-b631-f719f7d735a5\") " Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.269359 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2b71fe-44a0-4fae-b631-f719f7d735a5-kube-api-access-rch9t" (OuterVolumeSpecName: "kube-api-access-rch9t") pod "ec2b71fe-44a0-4fae-b631-f719f7d735a5" (UID: "ec2b71fe-44a0-4fae-b631-f719f7d735a5"). InnerVolumeSpecName "kube-api-access-rch9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.290410 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec2b71fe-44a0-4fae-b631-f719f7d735a5" (UID: "ec2b71fe-44a0-4fae-b631-f719f7d735a5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.299779 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-inventory" (OuterVolumeSpecName: "inventory") pod "ec2b71fe-44a0-4fae-b631-f719f7d735a5" (UID: "ec2b71fe-44a0-4fae-b631-f719f7d735a5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.364590 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.364877 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rch9t\" (UniqueName: \"kubernetes.io/projected/ec2b71fe-44a0-4fae-b631-f719f7d735a5-kube-api-access-rch9t\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.364957 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec2b71fe-44a0-4fae-b631-f719f7d735a5-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.680161 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" event={"ID":"ec2b71fe-44a0-4fae-b631-f719f7d735a5","Type":"ContainerDied","Data":"2dbd082a1aadff62de086acc90e41de06279bc35d9db4d704148438ddf553cc6"} Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.680242 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbd082a1aadff62de086acc90e41de06279bc35d9db4d704148438ddf553cc6" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.680335 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fc752" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.825667 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xnv2c"] Dec 16 13:23:39 crc kubenswrapper[4757]: E1216 13:23:39.830545 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2b71fe-44a0-4fae-b631-f719f7d735a5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.830601 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2b71fe-44a0-4fae-b631-f719f7d735a5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:23:39 crc kubenswrapper[4757]: E1216 13:23:39.830624 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="extract-utilities" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.830633 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="extract-utilities" Dec 16 13:23:39 crc kubenswrapper[4757]: E1216 13:23:39.830675 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="extract-content" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.830686 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="extract-content" Dec 16 13:23:39 crc kubenswrapper[4757]: E1216 13:23:39.830708 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="registry-server" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.830716 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="registry-server" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.830946 4757 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d9f4adb8-27d0-4a47-a24f-e7339822318c" containerName="registry-server" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.830978 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2b71fe-44a0-4fae-b631-f719f7d735a5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.831755 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.839420 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.839643 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.839875 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.839953 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.867104 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xnv2c"] Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.981190 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6j8z\" (UniqueName: \"kubernetes.io/projected/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-kube-api-access-t6j8z\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.981596 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:39 crc kubenswrapper[4757]: I1216 13:23:39.981732 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.084613 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6j8z\" (UniqueName: \"kubernetes.io/projected/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-kube-api-access-t6j8z\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.084751 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 
13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.084814 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.088791 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.089524 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.106095 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6j8z\" (UniqueName: \"kubernetes.io/projected/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-kube-api-access-t6j8z\") pod \"ssh-known-hosts-edpm-deployment-xnv2c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.162137 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:40 crc kubenswrapper[4757]: I1216 13:23:40.744445 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xnv2c"] Dec 16 13:23:41 crc kubenswrapper[4757]: I1216 13:23:41.697205 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" event={"ID":"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c","Type":"ContainerStarted","Data":"fbe2042f7f65c6aef975783d6e1ecadd57589020359391d6c263470e098bf1f7"} Dec 16 13:23:41 crc kubenswrapper[4757]: I1216 13:23:41.697738 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" event={"ID":"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c","Type":"ContainerStarted","Data":"bc7bc26fdbd3521795cb360237e8ea77cf683e74964482bf41c9c14cde51e11b"} Dec 16 13:23:41 crc kubenswrapper[4757]: I1216 13:23:41.716777 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" podStartSLOduration=2.026821467 podStartE2EDuration="2.716752698s" podCreationTimestamp="2025-12-16 13:23:39 +0000 UTC" firstStartedPulling="2025-12-16 13:23:40.747360493 +0000 UTC m=+2206.175104289" lastFinishedPulling="2025-12-16 13:23:41.437291714 +0000 UTC m=+2206.865035520" observedRunningTime="2025-12-16 13:23:41.712626197 +0000 UTC m=+2207.140370003" watchObservedRunningTime="2025-12-16 13:23:41.716752698 +0000 UTC m=+2207.144496494" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.381924 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkmfk"] Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.385247 4757 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.439335 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkmfk"] Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.466380 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqxf\" (UniqueName: \"kubernetes.io/projected/e23fd006-96ea-4d79-963c-659a77c987bb-kube-api-access-qpqxf\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.466514 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-catalog-content\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.466666 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-utilities\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.568096 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-utilities\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.568202 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqxf\" (UniqueName: \"kubernetes.io/projected/e23fd006-96ea-4d79-963c-659a77c987bb-kube-api-access-qpqxf\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.568282 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-catalog-content\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.569254 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-utilities\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.569276 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-catalog-content\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.607852 4757 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqxf\" (UniqueName: \"kubernetes.io/projected/e23fd006-96ea-4d79-963c-659a77c987bb-kube-api-access-qpqxf\") pod \"community-operators-vkmfk\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:44 crc kubenswrapper[4757]: I1216 13:23:44.703865 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:45 crc kubenswrapper[4757]: I1216 13:23:45.292117 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkmfk"] Dec 16 13:23:45 crc kubenswrapper[4757]: I1216 13:23:45.738852 4757 generic.go:334] "Generic (PLEG): container finished" podID="e23fd006-96ea-4d79-963c-659a77c987bb" containerID="f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b" exitCode=0 Dec 16 13:23:45 crc kubenswrapper[4757]: I1216 13:23:45.738963 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerDied","Data":"f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b"} Dec 16 13:23:45 crc kubenswrapper[4757]: I1216 13:23:45.739118 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerStarted","Data":"f398d0c6139520b5645f245b2c27d138af7837d072bb6d5b095ff359d468ab7d"} Dec 16 13:23:47 crc kubenswrapper[4757]: I1216 13:23:47.756026 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerStarted","Data":"8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827"} Dec 16 13:23:47 crc kubenswrapper[4757]: I1216 13:23:47.948897 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:23:47 crc kubenswrapper[4757]: E1216 13:23:47.949393 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:23:48 crc kubenswrapper[4757]: I1216 13:23:48.766892 4757 generic.go:334] "Generic (PLEG): container finished" podID="e23fd006-96ea-4d79-963c-659a77c987bb" containerID="8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827" exitCode=0 Dec 16 13:23:48 crc kubenswrapper[4757]: I1216 13:23:48.766982 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerDied","Data":"8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827"} Dec 16 13:23:49 crc kubenswrapper[4757]: I1216 13:23:49.779717 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerStarted","Data":"ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595"} Dec 16 13:23:49 crc kubenswrapper[4757]: I1216 
13:23:49.781671 4757 generic.go:334] "Generic (PLEG): container finished" podID="228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" containerID="fbe2042f7f65c6aef975783d6e1ecadd57589020359391d6c263470e098bf1f7" exitCode=0 Dec 16 13:23:49 crc kubenswrapper[4757]: I1216 13:23:49.781711 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" event={"ID":"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c","Type":"ContainerDied","Data":"fbe2042f7f65c6aef975783d6e1ecadd57589020359391d6c263470e098bf1f7"} Dec 16 13:23:49 crc kubenswrapper[4757]: I1216 13:23:49.809460 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkmfk" podStartSLOduration=2.32159224 podStartE2EDuration="5.809443908s" podCreationTimestamp="2025-12-16 13:23:44 +0000 UTC" firstStartedPulling="2025-12-16 13:23:45.744283872 +0000 UTC m=+2211.172027668" lastFinishedPulling="2025-12-16 13:23:49.23213554 +0000 UTC m=+2214.659879336" observedRunningTime="2025-12-16 13:23:49.800757204 +0000 UTC m=+2215.228501000" watchObservedRunningTime="2025-12-16 13:23:49.809443908 +0000 UTC m=+2215.237187704" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.220955 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.403806 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6j8z\" (UniqueName: \"kubernetes.io/projected/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-kube-api-access-t6j8z\") pod \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.404179 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-inventory-0\") pod \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.404237 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-ssh-key-openstack-edpm-ipam\") pod \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\" (UID: \"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c\") " Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.410437 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-kube-api-access-t6j8z" (OuterVolumeSpecName: "kube-api-access-t6j8z") pod "228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" (UID: "228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c"). InnerVolumeSpecName "kube-api-access-t6j8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.434910 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" (UID: "228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.437629 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" (UID: "228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.506612 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6j8z\" (UniqueName: \"kubernetes.io/projected/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-kube-api-access-t6j8z\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.506646 4757 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.506656 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.799263 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" event={"ID":"228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c","Type":"ContainerDied","Data":"bc7bc26fdbd3521795cb360237e8ea77cf683e74964482bf41c9c14cde51e11b"} Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.799301 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7bc26fdbd3521795cb360237e8ea77cf683e74964482bf41c9c14cde51e11b" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.799323 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xnv2c" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.894937 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh"] Dec 16 13:23:51 crc kubenswrapper[4757]: E1216 13:23:51.895326 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" containerName="ssh-known-hosts-edpm-deployment" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.895342 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" containerName="ssh-known-hosts-edpm-deployment" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.895507 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c" containerName="ssh-known-hosts-edpm-deployment" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.896116 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.901055 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.901293 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.901471 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.901747 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:23:51 crc kubenswrapper[4757]: I1216 13:23:51.954355 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh"] Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.017314 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.017374 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.017436 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnkkq\" (UniqueName: \"kubernetes.io/projected/af434566-0202-4f31-a55c-440b7ae410e6-kube-api-access-fnkkq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.120357 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.120418 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.120494 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnkkq\" (UniqueName: \"kubernetes.io/projected/af434566-0202-4f31-a55c-440b7ae410e6-kube-api-access-fnkkq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.125858 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.127358 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.140454 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnkkq\" (UniqueName: \"kubernetes.io/projected/af434566-0202-4f31-a55c-440b7ae410e6-kube-api-access-fnkkq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kdgqh\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.212436 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.738863 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh"] Dec 16 13:23:52 crc kubenswrapper[4757]: W1216 13:23:52.740902 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf434566_0202_4f31_a55c_440b7ae410e6.slice/crio-719a9599881c26876e5e9c844334c59a51556b09ba45757fc325f721bf3ff209 WatchSource:0}: Error finding container 719a9599881c26876e5e9c844334c59a51556b09ba45757fc325f721bf3ff209: Status 404 returned error can't find the container with id 719a9599881c26876e5e9c844334c59a51556b09ba45757fc325f721bf3ff209 Dec 16 13:23:52 crc kubenswrapper[4757]: I1216 13:23:52.829905 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" event={"ID":"af434566-0202-4f31-a55c-440b7ae410e6","Type":"ContainerStarted","Data":"719a9599881c26876e5e9c844334c59a51556b09ba45757fc325f721bf3ff209"} Dec 16 13:23:53 crc kubenswrapper[4757]: I1216 13:23:53.838891 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" event={"ID":"af434566-0202-4f31-a55c-440b7ae410e6","Type":"ContainerStarted","Data":"7b74af0b762c93c95743fc9494502a2c0e50c5f36defbdeb2fd1ad513cc5a800"} Dec 16 13:23:53 crc kubenswrapper[4757]: I1216 13:23:53.855356 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" podStartSLOduration=2.223245105 podStartE2EDuration="2.855334671s" podCreationTimestamp="2025-12-16 13:23:51 +0000 UTC" firstStartedPulling="2025-12-16 13:23:52.743885623 +0000 UTC m=+2218.171629419" lastFinishedPulling="2025-12-16 13:23:53.375975189 +0000 UTC m=+2218.803718985" observedRunningTime="2025-12-16 13:23:53.853835304 +0000 UTC m=+2219.281579100" watchObservedRunningTime="2025-12-16 13:23:53.855334671 +0000 UTC 
m=+2219.283078467" Dec 16 13:23:54 crc kubenswrapper[4757]: I1216 13:23:54.704609 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:54 crc kubenswrapper[4757]: I1216 13:23:54.704930 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:54 crc kubenswrapper[4757]: I1216 13:23:54.759654 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:54 crc kubenswrapper[4757]: I1216 13:23:54.905213 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:54 crc kubenswrapper[4757]: I1216 13:23:54.997532 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkmfk"] Dec 16 13:23:56 crc kubenswrapper[4757]: I1216 13:23:56.863992 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkmfk" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="registry-server" containerID="cri-o://ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595" gracePeriod=2 Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.352918 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.524062 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-catalog-content\") pod \"e23fd006-96ea-4d79-963c-659a77c987bb\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.524157 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-utilities\") pod \"e23fd006-96ea-4d79-963c-659a77c987bb\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.524225 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpqxf\" (UniqueName: \"kubernetes.io/projected/e23fd006-96ea-4d79-963c-659a77c987bb-kube-api-access-qpqxf\") pod \"e23fd006-96ea-4d79-963c-659a77c987bb\" (UID: \"e23fd006-96ea-4d79-963c-659a77c987bb\") " Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.525394 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-utilities" (OuterVolumeSpecName: "utilities") pod "e23fd006-96ea-4d79-963c-659a77c987bb" (UID: "e23fd006-96ea-4d79-963c-659a77c987bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.533466 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23fd006-96ea-4d79-963c-659a77c987bb-kube-api-access-qpqxf" (OuterVolumeSpecName: "kube-api-access-qpqxf") pod "e23fd006-96ea-4d79-963c-659a77c987bb" (UID: "e23fd006-96ea-4d79-963c-659a77c987bb"). InnerVolumeSpecName "kube-api-access-qpqxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.579774 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e23fd006-96ea-4d79-963c-659a77c987bb" (UID: "e23fd006-96ea-4d79-963c-659a77c987bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.626594 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpqxf\" (UniqueName: \"kubernetes.io/projected/e23fd006-96ea-4d79-963c-659a77c987bb-kube-api-access-qpqxf\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.626813 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.626822 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e23fd006-96ea-4d79-963c-659a77c987bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.873743 4757 generic.go:334] "Generic (PLEG): container finished" podID="e23fd006-96ea-4d79-963c-659a77c987bb" containerID="ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595" exitCode=0 Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.873798 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerDied","Data":"ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595"} Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.873833 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkmfk" event={"ID":"e23fd006-96ea-4d79-963c-659a77c987bb","Type":"ContainerDied","Data":"f398d0c6139520b5645f245b2c27d138af7837d072bb6d5b095ff359d468ab7d"} Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.873854 4757 scope.go:117] "RemoveContainer" containerID="ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.874029 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkmfk" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.912425 4757 scope.go:117] "RemoveContainer" containerID="8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.913200 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkmfk"] Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.923674 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkmfk"] Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.933429 4757 scope.go:117] "RemoveContainer" containerID="f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.974128 4757 scope.go:117] "RemoveContainer" containerID="ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595" Dec 16 13:23:57 crc kubenswrapper[4757]: E1216 13:23:57.975515 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595\": container with ID starting with ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595 not found: ID does not exist" containerID="ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.975561 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595"} err="failed to get container status \"ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595\": rpc error: code = NotFound desc = could not find container \"ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595\": container with ID starting with ffca193a60c9890f529655abd20c21564d0669a3852d8984e27754f64664c595 not found: ID does not exist" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.975585 4757 scope.go:117] "RemoveContainer" containerID="8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827" Dec 16 13:23:57 crc kubenswrapper[4757]: E1216 13:23:57.975968 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827\": container with ID starting with 8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827 not found: ID does not exist" containerID="8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.975992 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827"} err="failed to get container status \"8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827\": rpc error: code = NotFound desc = could not find container \"8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827\": container with ID starting with 8b77b4632a332be2486e3dfac3b176c095bfedc07a56a301f5a378498e8a0827 not found: ID does not exist" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.976019 4757 scope.go:117] "RemoveContainer" containerID="f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b" Dec 16 13:23:57 crc kubenswrapper[4757]: E1216 13:23:57.976329 4757 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b\": container with ID starting with f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b not found: ID does not exist" containerID="f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b" Dec 16 13:23:57 crc kubenswrapper[4757]: I1216 13:23:57.976453 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b"} err="failed to get container status \"f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b\": rpc error: code = NotFound desc = could not find container \"f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b\": container with ID starting with f5d0a84cc08d0d6fd6a61e98e0c942fbfed58224d2c0dc75be560653623e062b not found: ID does not exist" Dec 16 13:23:58 crc kubenswrapper[4757]: I1216 13:23:58.959647 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" path="/var/lib/kubelet/pods/e23fd006-96ea-4d79-963c-659a77c987bb/volumes" Dec 16 13:23:59 crc kubenswrapper[4757]: I1216 13:23:59.948956 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:23:59 crc kubenswrapper[4757]: E1216 13:23:59.949567 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:24:02 crc kubenswrapper[4757]: I1216 13:24:02.923813 4757 generic.go:334] "Generic (PLEG): container finished" podID="af434566-0202-4f31-a55c-440b7ae410e6" containerID="7b74af0b762c93c95743fc9494502a2c0e50c5f36defbdeb2fd1ad513cc5a800" exitCode=0 Dec 16 13:24:02 crc kubenswrapper[4757]: I1216 13:24:02.924033 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" event={"ID":"af434566-0202-4f31-a55c-440b7ae410e6","Type":"ContainerDied","Data":"7b74af0b762c93c95743fc9494502a2c0e50c5f36defbdeb2fd1ad513cc5a800"} Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.359727 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.501778 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-ssh-key\") pod \"af434566-0202-4f31-a55c-440b7ae410e6\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.501874 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-inventory\") pod \"af434566-0202-4f31-a55c-440b7ae410e6\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.502134 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnkkq\" (UniqueName: \"kubernetes.io/projected/af434566-0202-4f31-a55c-440b7ae410e6-kube-api-access-fnkkq\") pod \"af434566-0202-4f31-a55c-440b7ae410e6\" (UID: \"af434566-0202-4f31-a55c-440b7ae410e6\") " Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.507683 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af434566-0202-4f31-a55c-440b7ae410e6-kube-api-access-fnkkq" (OuterVolumeSpecName: "kube-api-access-fnkkq") pod "af434566-0202-4f31-a55c-440b7ae410e6" (UID: "af434566-0202-4f31-a55c-440b7ae410e6"). InnerVolumeSpecName "kube-api-access-fnkkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.529321 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af434566-0202-4f31-a55c-440b7ae410e6" (UID: "af434566-0202-4f31-a55c-440b7ae410e6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.538352 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-inventory" (OuterVolumeSpecName: "inventory") pod "af434566-0202-4f31-a55c-440b7ae410e6" (UID: "af434566-0202-4f31-a55c-440b7ae410e6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.604463 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.604500 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af434566-0202-4f31-a55c-440b7ae410e6-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.604517 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnkkq\" (UniqueName: \"kubernetes.io/projected/af434566-0202-4f31-a55c-440b7ae410e6-kube-api-access-fnkkq\") on node \"crc\" DevicePath \"\"" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.944895 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" event={"ID":"af434566-0202-4f31-a55c-440b7ae410e6","Type":"ContainerDied","Data":"719a9599881c26876e5e9c844334c59a51556b09ba45757fc325f721bf3ff209"} Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.944932 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719a9599881c26876e5e9c844334c59a51556b09ba45757fc325f721bf3ff209" Dec 16 13:24:04 crc kubenswrapper[4757]: I1216 13:24:04.945617 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kdgqh" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.027097 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9"] Dec 16 13:24:05 crc kubenswrapper[4757]: E1216 13:24:05.028557 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="registry-server" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.028594 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="registry-server" Dec 16 13:24:05 crc kubenswrapper[4757]: E1216 13:24:05.028610 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af434566-0202-4f31-a55c-440b7ae410e6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.028619 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="af434566-0202-4f31-a55c-440b7ae410e6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:24:05 crc kubenswrapper[4757]: E1216 13:24:05.028641 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="extract-content" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.028653 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="extract-content" Dec 16 13:24:05 crc kubenswrapper[4757]: E1216 13:24:05.028674 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="extract-utilities" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.028680 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="extract-utilities" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.030809 4757 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e23fd006-96ea-4d79-963c-659a77c987bb" containerName="registry-server" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.030837 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="af434566-0202-4f31-a55c-440b7ae410e6" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.036124 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.042368 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.046533 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.046951 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.048570 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.059888 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9"] Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.217916 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.220578 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.220943 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnqc8\" (UniqueName: \"kubernetes.io/projected/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-kube-api-access-gnqc8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.322705 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.322809 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnqc8\" (UniqueName: \"kubernetes.io/projected/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-kube-api-access-gnqc8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: 
\"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.322876 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.332739 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.342598 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.350786 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnqc8\" (UniqueName: \"kubernetes.io/projected/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-kube-api-access-gnqc8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.382325 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.924400 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9"] Dec 16 13:24:05 crc kubenswrapper[4757]: I1216 13:24:05.959877 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" event={"ID":"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4","Type":"ContainerStarted","Data":"244c675aa9e2fa27def94e946e8fede04b5cfdf961004bf1de10b9555d6ad45f"} Dec 16 13:24:06 crc kubenswrapper[4757]: I1216 13:24:06.972713 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" event={"ID":"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4","Type":"ContainerStarted","Data":"85d6ddef2a9c24ebc71aa2fd09807b7dcdb1e6c47778367241b4eaad4d8a84d5"} Dec 16 13:24:06 crc kubenswrapper[4757]: I1216 13:24:06.998183 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" podStartSLOduration=1.403858428 podStartE2EDuration="1.9981642s" podCreationTimestamp="2025-12-16 13:24:05 +0000 UTC" firstStartedPulling="2025-12-16 13:24:05.936292225 +0000 UTC m=+2231.364036021" lastFinishedPulling="2025-12-16 13:24:06.530597997 +0000 UTC m=+2231.958341793" observedRunningTime="2025-12-16 13:24:06.98955102 +0000 UTC m=+2232.417294826" watchObservedRunningTime="2025-12-16 13:24:06.9981642 +0000 UTC m=+2232.425907996" Dec 16 13:24:13 crc kubenswrapper[4757]: I1216 13:24:13.949394 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:24:13 crc kubenswrapper[4757]: E1216 13:24:13.950091 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:24:17 crc kubenswrapper[4757]: I1216 13:24:17.060478 4757 generic.go:334] "Generic (PLEG): container finished" podID="f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" containerID="85d6ddef2a9c24ebc71aa2fd09807b7dcdb1e6c47778367241b4eaad4d8a84d5" exitCode=0 Dec 16 13:24:17 crc kubenswrapper[4757]: I1216 13:24:17.060576 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" event={"ID":"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4","Type":"ContainerDied","Data":"85d6ddef2a9c24ebc71aa2fd09807b7dcdb1e6c47778367241b4eaad4d8a84d5"} Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.484983 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.585719 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-inventory\") pod \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.585769 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnqc8\" (UniqueName: \"kubernetes.io/projected/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-kube-api-access-gnqc8\") pod \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.585846 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-ssh-key\") pod \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\" (UID: \"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4\") " Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.594265 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-kube-api-access-gnqc8" (OuterVolumeSpecName: "kube-api-access-gnqc8") pod "f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" (UID: "f4c2d838-cc46-4457-9b88-5ea6eb7f14e4"). InnerVolumeSpecName "kube-api-access-gnqc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.612729 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-inventory" (OuterVolumeSpecName: "inventory") pod "f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" (UID: "f4c2d838-cc46-4457-9b88-5ea6eb7f14e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.615686 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" (UID: "f4c2d838-cc46-4457-9b88-5ea6eb7f14e4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.689763 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.689810 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnqc8\" (UniqueName: \"kubernetes.io/projected/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-kube-api-access-gnqc8\") on node \"crc\" DevicePath \"\"" Dec 16 13:24:18 crc kubenswrapper[4757]: I1216 13:24:18.689821 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f4c2d838-cc46-4457-9b88-5ea6eb7f14e4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.091283 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" event={"ID":"f4c2d838-cc46-4457-9b88-5ea6eb7f14e4","Type":"ContainerDied","Data":"244c675aa9e2fa27def94e946e8fede04b5cfdf961004bf1de10b9555d6ad45f"} Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.091321 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244c675aa9e2fa27def94e946e8fede04b5cfdf961004bf1de10b9555d6ad45f" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.091361 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.189575 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt"] Dec 16 13:24:19 crc kubenswrapper[4757]: E1216 13:24:19.190344 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.190364 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.190571 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c2d838-cc46-4457-9b88-5ea6eb7f14e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.191233 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.194584 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.194881 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.197186 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.197185 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.197195 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.197592 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.197656 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.197745 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.208636 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt"] Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315016 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315090 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315134 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315165 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315189 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315214 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315254 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315278 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315305 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315361 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bdnk\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-kube-api-access-8bdnk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315402 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315458 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315509 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.315570 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417128 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417189 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417230 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417269 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 
crc kubenswrapper[4757]: I1216 13:24:19.417308 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417338 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417370 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417391 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417419 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417472 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bdnk\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-kube-api-access-8bdnk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417507 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417552 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417609 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.417643 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.422824 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.422928 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.422988 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.424405 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.424811 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.426111 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.426237 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.426518 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.427369 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.428899 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.429760 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.430041 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.430746 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: 
\"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.445641 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bdnk\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-kube-api-access-8bdnk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:19 crc kubenswrapper[4757]: I1216 13:24:19.511348 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:24:20 crc kubenswrapper[4757]: I1216 13:24:20.049387 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt"] Dec 16 13:24:20 crc kubenswrapper[4757]: I1216 13:24:20.100386 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" event={"ID":"0b943ec1-dc21-47ab-832a-d6f68f3ac17f","Type":"ContainerStarted","Data":"66f7f7529ac4e6f9608903a422b89bb2e3fbc7fcdde5c685e707474e939a9437"} Dec 16 13:24:21 crc kubenswrapper[4757]: I1216 13:24:21.110636 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" event={"ID":"0b943ec1-dc21-47ab-832a-d6f68f3ac17f","Type":"ContainerStarted","Data":"306d9eb42a57c199b3bbddd4a7a734a38ce5f41ead8b7c35414bf88a7d48b4bf"} Dec 16 13:24:21 crc kubenswrapper[4757]: I1216 13:24:21.135229 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" podStartSLOduration=1.638073186 podStartE2EDuration="2.135206902s" podCreationTimestamp="2025-12-16 13:24:19 +0000 UTC" firstStartedPulling="2025-12-16 13:24:20.058900733 +0000 UTC m=+2245.486644529" lastFinishedPulling="2025-12-16 13:24:20.556034449 +0000 UTC m=+2245.983778245" observedRunningTime="2025-12-16 13:24:21.130420395 +0000 UTC m=+2246.558164211" watchObservedRunningTime="2025-12-16 13:24:21.135206902 +0000 UTC m=+2246.562950688" Dec 16 13:24:25 crc kubenswrapper[4757]: I1216 13:24:25.971154 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:24:25 crc kubenswrapper[4757]: E1216 13:24:25.972145 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:24:36 crc kubenswrapper[4757]: I1216 13:24:36.949049 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:24:36 crc kubenswrapper[4757]: E1216 13:24:36.949815 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:24:49 crc kubenswrapper[4757]: I1216 13:24:49.949241 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:24:49 crc kubenswrapper[4757]: E1216 13:24:49.950248 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:25:01 crc kubenswrapper[4757]: I1216 13:25:01.430324 4757 generic.go:334] "Generic (PLEG): container finished" podID="0b943ec1-dc21-47ab-832a-d6f68f3ac17f" containerID="306d9eb42a57c199b3bbddd4a7a734a38ce5f41ead8b7c35414bf88a7d48b4bf" exitCode=0 Dec 16 13:25:01 crc kubenswrapper[4757]: I1216 13:25:01.430387 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" event={"ID":"0b943ec1-dc21-47ab-832a-d6f68f3ac17f","Type":"ContainerDied","Data":"306d9eb42a57c199b3bbddd4a7a734a38ce5f41ead8b7c35414bf88a7d48b4bf"} Dec 16 13:25:01 crc kubenswrapper[4757]: I1216 13:25:01.949355 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:25:01 crc kubenswrapper[4757]: E1216 13:25:01.949613 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.863866 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971635 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-bootstrap-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971694 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ssh-key\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971775 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ovn-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971818 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-libvirt-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971868 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-nova-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971924 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-telemetry-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.971990 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-inventory\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972118 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-neutron-metadata-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972165 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972193 4757 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972245 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972279 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bdnk\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-kube-api-access-8bdnk\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972319 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-repo-setup-combined-ca-bundle\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.972362 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\" (UID: \"0b943ec1-dc21-47ab-832a-d6f68f3ac17f\") " Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.979996 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.980617 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.981559 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.985057 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.987682 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-kube-api-access-8bdnk" (OuterVolumeSpecName: "kube-api-access-8bdnk") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "kube-api-access-8bdnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.987948 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.987972 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.994121 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.994503 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.997054 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:25:02 crc kubenswrapper[4757]: I1216 13:25:02.999239 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.022238 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.025797 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.030317 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-inventory" (OuterVolumeSpecName: "inventory") pod "0b943ec1-dc21-47ab-832a-d6f68f3ac17f" (UID: "0b943ec1-dc21-47ab-832a-d6f68f3ac17f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075197 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075254 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075273 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075292 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bdnk\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-kube-api-access-8bdnk\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075306 4757 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075324 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075342 4757 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075357 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075371 4757 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075383 4757 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075397 4757 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075411 4757 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075423 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.075435 4757 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b943ec1-dc21-47ab-832a-d6f68f3ac17f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.449833 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" event={"ID":"0b943ec1-dc21-47ab-832a-d6f68f3ac17f","Type":"ContainerDied","Data":"66f7f7529ac4e6f9608903a422b89bb2e3fbc7fcdde5c685e707474e939a9437"} Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.449893 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.449911 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f7f7529ac4e6f9608903a422b89bb2e3fbc7fcdde5c685e707474e939a9437" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.570282 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6"] Dec 16 13:25:03 crc kubenswrapper[4757]: E1216 13:25:03.570646 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b943ec1-dc21-47ab-832a-d6f68f3ac17f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.570666 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b943ec1-dc21-47ab-832a-d6f68f3ac17f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.570837 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b943ec1-dc21-47ab-832a-d6f68f3ac17f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.571511 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.573255 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.573677 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.574211 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.581776 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.582442 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.594321 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6"] Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.686033 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.686089 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nnrf\" (UniqueName: \"kubernetes.io/projected/85e4dfc5-8085-4270-847a-a36c8194b383-kube-api-access-9nnrf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.686118 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/85e4dfc5-8085-4270-847a-a36c8194b383-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.686136 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.686195 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.788109 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/85e4dfc5-8085-4270-847a-a36c8194b383-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.788621 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.788734 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.788877 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.788955 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nnrf\" (UniqueName: \"kubernetes.io/projected/85e4dfc5-8085-4270-847a-a36c8194b383-kube-api-access-9nnrf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.790101 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/85e4dfc5-8085-4270-847a-a36c8194b383-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.793817 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.794595 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.797601 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.808210 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nnrf\" (UniqueName: \"kubernetes.io/projected/85e4dfc5-8085-4270-847a-a36c8194b383-kube-api-access-9nnrf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jj4t6\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:03 crc kubenswrapper[4757]: I1216 13:25:03.902884 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:25:04 crc kubenswrapper[4757]: I1216 13:25:04.437131 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6"] Dec 16 13:25:04 crc kubenswrapper[4757]: I1216 13:25:04.458272 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" event={"ID":"85e4dfc5-8085-4270-847a-a36c8194b383","Type":"ContainerStarted","Data":"2f0bd0e8946254e5dcdecc5ca5b4d624e250dda077a0ed449f59fdb0b08e73e1"} Dec 16 13:25:05 crc kubenswrapper[4757]: I1216 13:25:05.473109 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" event={"ID":"85e4dfc5-8085-4270-847a-a36c8194b383","Type":"ContainerStarted","Data":"275fe81628e174d22ec4bba4b080d77187e72b9f34705f675f89d4bec34d97fd"} Dec 16 13:25:05 crc kubenswrapper[4757]: I1216 13:25:05.498459 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" podStartSLOduration=2.017355278 podStartE2EDuration="2.498440052s" podCreationTimestamp="2025-12-16 13:25:03 +0000 UTC" firstStartedPulling="2025-12-16 13:25:04.443082855 +0000 UTC m=+2289.870826651" lastFinishedPulling="2025-12-16 13:25:04.924167629 +0000 UTC m=+2290.351911425" observedRunningTime="2025-12-16 13:25:05.49509682 +0000 UTC m=+2290.922840626" watchObservedRunningTime="2025-12-16 13:25:05.498440052 +0000 UTC m=+2290.926183848" Dec 16 13:25:15 crc kubenswrapper[4757]: I1216 13:25:15.948466 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:25:15 crc kubenswrapper[4757]: E1216 13:25:15.949264 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:25:30 crc kubenswrapper[4757]: I1216 13:25:30.953350 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:25:30 crc kubenswrapper[4757]: E1216 13:25:30.954195 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:25:43 crc 
kubenswrapper[4757]: I1216 13:25:43.948798 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:25:43 crc kubenswrapper[4757]: E1216 13:25:43.949685 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:25:54 crc kubenswrapper[4757]: I1216 13:25:54.956849 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:25:54 crc kubenswrapper[4757]: E1216 13:25:54.958057 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:26:06 crc kubenswrapper[4757]: I1216 13:26:06.949037 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:26:06 crc kubenswrapper[4757]: E1216 13:26:06.949853 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:26:17 crc kubenswrapper[4757]: I1216 13:26:17.948787 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:26:17 crc kubenswrapper[4757]: E1216 13:26:17.949526 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:26:22 crc kubenswrapper[4757]: I1216 13:26:22.114122 4757 generic.go:334] "Generic (PLEG): container finished" podID="85e4dfc5-8085-4270-847a-a36c8194b383" containerID="275fe81628e174d22ec4bba4b080d77187e72b9f34705f675f89d4bec34d97fd" exitCode=0 Dec 16 13:26:22 crc kubenswrapper[4757]: I1216 13:26:22.114638 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" event={"ID":"85e4dfc5-8085-4270-847a-a36c8194b383","Type":"ContainerDied","Data":"275fe81628e174d22ec4bba4b080d77187e72b9f34705f675f89d4bec34d97fd"} Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.531645 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.724527 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nnrf\" (UniqueName: \"kubernetes.io/projected/85e4dfc5-8085-4270-847a-a36c8194b383-kube-api-access-9nnrf\") pod \"85e4dfc5-8085-4270-847a-a36c8194b383\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.724970 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-inventory\") pod \"85e4dfc5-8085-4270-847a-a36c8194b383\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.725222 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ovn-combined-ca-bundle\") pod \"85e4dfc5-8085-4270-847a-a36c8194b383\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.725277 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/85e4dfc5-8085-4270-847a-a36c8194b383-ovncontroller-config-0\") pod \"85e4dfc5-8085-4270-847a-a36c8194b383\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.725373 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ssh-key\") pod \"85e4dfc5-8085-4270-847a-a36c8194b383\" (UID: \"85e4dfc5-8085-4270-847a-a36c8194b383\") " Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.732190 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e4dfc5-8085-4270-847a-a36c8194b383-kube-api-access-9nnrf" (OuterVolumeSpecName: "kube-api-access-9nnrf") pod "85e4dfc5-8085-4270-847a-a36c8194b383" (UID: "85e4dfc5-8085-4270-847a-a36c8194b383"). InnerVolumeSpecName "kube-api-access-9nnrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.741119 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "85e4dfc5-8085-4270-847a-a36c8194b383" (UID: "85e4dfc5-8085-4270-847a-a36c8194b383"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.754520 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e4dfc5-8085-4270-847a-a36c8194b383-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "85e4dfc5-8085-4270-847a-a36c8194b383" (UID: "85e4dfc5-8085-4270-847a-a36c8194b383"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.754623 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-inventory" (OuterVolumeSpecName: "inventory") pod "85e4dfc5-8085-4270-847a-a36c8194b383" (UID: "85e4dfc5-8085-4270-847a-a36c8194b383"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.777121 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85e4dfc5-8085-4270-847a-a36c8194b383" (UID: "85e4dfc5-8085-4270-847a-a36c8194b383"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.827699 4757 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.827733 4757 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/85e4dfc5-8085-4270-847a-a36c8194b383-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.827743 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.827752 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nnrf\" (UniqueName: \"kubernetes.io/projected/85e4dfc5-8085-4270-847a-a36c8194b383-kube-api-access-9nnrf\") on node \"crc\" DevicePath \"\"" Dec 16 13:26:23 crc kubenswrapper[4757]: I1216 13:26:23.827761 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85e4dfc5-8085-4270-847a-a36c8194b383-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.132096 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" event={"ID":"85e4dfc5-8085-4270-847a-a36c8194b383","Type":"ContainerDied","Data":"2f0bd0e8946254e5dcdecc5ca5b4d624e250dda077a0ed449f59fdb0b08e73e1"} Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.132146 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f0bd0e8946254e5dcdecc5ca5b4d624e250dda077a0ed449f59fdb0b08e73e1" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.132176 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jj4t6" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.246513 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl"] Dec 16 13:26:24 crc kubenswrapper[4757]: E1216 13:26:24.246860 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e4dfc5-8085-4270-847a-a36c8194b383" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.246878 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e4dfc5-8085-4270-847a-a36c8194b383" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.247110 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e4dfc5-8085-4270-847a-a36c8194b383" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.247674 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.250542 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.251832 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.252114 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.254952 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.256410 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.259492 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.269498 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl"] Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.439327 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.439589 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swb87\" (UniqueName: \"kubernetes.io/projected/ad61ff87-21a4-4583-83b4-65c2253f2993-kube-api-access-swb87\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.439625 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.439801 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.439890 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.439973 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.541507 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.541558 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swb87\" (UniqueName: \"kubernetes.io/projected/ad61ff87-21a4-4583-83b4-65c2253f2993-kube-api-access-swb87\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.541601 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.541696 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: 
\"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.541758 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.541890 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.546258 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.547299 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.547768 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.548492 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.550200 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.561823 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-swb87\" (UniqueName: \"kubernetes.io/projected/ad61ff87-21a4-4583-83b4-65c2253f2993-kube-api-access-swb87\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:24 crc kubenswrapper[4757]: I1216 13:26:24.564699 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:26:25 crc kubenswrapper[4757]: I1216 13:26:25.121567 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl"] Dec 16 13:26:25 crc kubenswrapper[4757]: I1216 13:26:25.131991 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:26:25 crc kubenswrapper[4757]: I1216 13:26:25.143956 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" event={"ID":"ad61ff87-21a4-4583-83b4-65c2253f2993","Type":"ContainerStarted","Data":"c334126afa6750797103a068e77903281cf022b9411b08b5c472c23b35552e19"} Dec 16 13:26:26 crc kubenswrapper[4757]: I1216 13:26:26.156426 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" event={"ID":"ad61ff87-21a4-4583-83b4-65c2253f2993","Type":"ContainerStarted","Data":"ead63f389d73c87ff4c98d319ffb060b9e0062e13816b73ca5fb0bb4945b088d"} Dec 16 13:26:26 crc kubenswrapper[4757]: I1216 13:26:26.174801 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" podStartSLOduration=1.519180441 podStartE2EDuration="2.174781452s" podCreationTimestamp="2025-12-16 13:26:24 +0000 UTC" firstStartedPulling="2025-12-16 13:26:25.131721616 +0000 UTC m=+2370.559465412" lastFinishedPulling="2025-12-16 13:26:25.787322627 +0000 UTC m=+2371.215066423" observedRunningTime="2025-12-16 13:26:26.172175748 +0000 UTC m=+2371.599919554" watchObservedRunningTime="2025-12-16 13:26:26.174781452 +0000 UTC m=+2371.602525248" Dec 16 13:26:29 crc kubenswrapper[4757]: I1216 13:26:29.949807 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:26:29 crc kubenswrapper[4757]: E1216 13:26:29.951385 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:26:43 crc kubenswrapper[4757]: I1216 13:26:43.948737 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:26:43 crc kubenswrapper[4757]: E1216 13:26:43.949559 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" 
podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:26:55 crc kubenswrapper[4757]: I1216 13:26:55.948614 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:26:55 crc kubenswrapper[4757]: E1216 13:26:55.949462 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:27:07 crc kubenswrapper[4757]: I1216 13:27:07.948714 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:27:07 crc kubenswrapper[4757]: E1216 13:27:07.949449 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:27:22 crc kubenswrapper[4757]: I1216 13:27:22.949504 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:27:22 crc kubenswrapper[4757]: E1216 13:27:22.950352 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:27:25 crc kubenswrapper[4757]: E1216 13:27:25.140394 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad61ff87_21a4_4583_83b4_65c2253f2993.slice/crio-ead63f389d73c87ff4c98d319ffb060b9e0062e13816b73ca5fb0bb4945b088d.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:27:25 crc kubenswrapper[4757]: I1216 13:27:25.667217 4757 generic.go:334] "Generic (PLEG): container finished" podID="ad61ff87-21a4-4583-83b4-65c2253f2993" containerID="ead63f389d73c87ff4c98d319ffb060b9e0062e13816b73ca5fb0bb4945b088d" exitCode=0 Dec 16 13:27:25 crc kubenswrapper[4757]: I1216 13:27:25.667267 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" event={"ID":"ad61ff87-21a4-4583-83b4-65c2253f2993","Type":"ContainerDied","Data":"ead63f389d73c87ff4c98d319ffb060b9e0062e13816b73ca5fb0bb4945b088d"} Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.064691 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.188143 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-metadata-combined-ca-bundle\") pod \"ad61ff87-21a4-4583-83b4-65c2253f2993\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.188185 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-nova-metadata-neutron-config-0\") pod \"ad61ff87-21a4-4583-83b4-65c2253f2993\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.188288 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-inventory\") pod \"ad61ff87-21a4-4583-83b4-65c2253f2993\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.188380 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ad61ff87-21a4-4583-83b4-65c2253f2993\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.188422 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-ssh-key\") pod \"ad61ff87-21a4-4583-83b4-65c2253f2993\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.188498 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swb87\" (UniqueName: \"kubernetes.io/projected/ad61ff87-21a4-4583-83b4-65c2253f2993-kube-api-access-swb87\") pod \"ad61ff87-21a4-4583-83b4-65c2253f2993\" (UID: \"ad61ff87-21a4-4583-83b4-65c2253f2993\") " Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.195589 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ad61ff87-21a4-4583-83b4-65c2253f2993" (UID: "ad61ff87-21a4-4583-83b4-65c2253f2993"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.195603 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad61ff87-21a4-4583-83b4-65c2253f2993-kube-api-access-swb87" (OuterVolumeSpecName: "kube-api-access-swb87") pod "ad61ff87-21a4-4583-83b4-65c2253f2993" (UID: "ad61ff87-21a4-4583-83b4-65c2253f2993"). InnerVolumeSpecName "kube-api-access-swb87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.217405 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-inventory" (OuterVolumeSpecName: "inventory") pod "ad61ff87-21a4-4583-83b4-65c2253f2993" (UID: "ad61ff87-21a4-4583-83b4-65c2253f2993"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.218051 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad61ff87-21a4-4583-83b4-65c2253f2993" (UID: "ad61ff87-21a4-4583-83b4-65c2253f2993"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.218465 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ad61ff87-21a4-4583-83b4-65c2253f2993" (UID: "ad61ff87-21a4-4583-83b4-65c2253f2993"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.219579 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ad61ff87-21a4-4583-83b4-65c2253f2993" (UID: "ad61ff87-21a4-4583-83b4-65c2253f2993"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.291103 4757 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.291129 4757 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.291139 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.291151 4757 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.291162 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad61ff87-21a4-4583-83b4-65c2253f2993-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.291173 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swb87\" (UniqueName: \"kubernetes.io/projected/ad61ff87-21a4-4583-83b4-65c2253f2993-kube-api-access-swb87\") on node \"crc\" DevicePath \"\"" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.688168 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" event={"ID":"ad61ff87-21a4-4583-83b4-65c2253f2993","Type":"ContainerDied","Data":"c334126afa6750797103a068e77903281cf022b9411b08b5c472c23b35552e19"} Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.688229 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c334126afa6750797103a068e77903281cf022b9411b08b5c472c23b35552e19" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.688197 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.795255 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp"] Dec 16 13:27:27 crc kubenswrapper[4757]: E1216 13:27:27.795639 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad61ff87-21a4-4583-83b4-65c2253f2993" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.795664 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad61ff87-21a4-4583-83b4-65c2253f2993" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.795846 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad61ff87-21a4-4583-83b4-65c2253f2993" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.796564 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.799798 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.800102 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.800440 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.800582 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.800807 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.809953 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp"] Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.903412 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdj4\" (UniqueName: \"kubernetes.io/projected/d146c06e-d73a-47a2-8e1f-07ca485b1a72-kube-api-access-hwdj4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.903553 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.903590 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.903668 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:27 crc kubenswrapper[4757]: I1216 13:27:27.903707 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.005373 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.005652 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.005832 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.005976 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.006157 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdj4\" (UniqueName: \"kubernetes.io/projected/d146c06e-d73a-47a2-8e1f-07ca485b1a72-kube-api-access-hwdj4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.009739 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.009743 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.011223 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.023459 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.025185 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdj4\" (UniqueName: \"kubernetes.io/projected/d146c06e-d73a-47a2-8e1f-07ca485b1a72-kube-api-access-hwdj4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.121577 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.620563 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp"] Dec 16 13:27:28 crc kubenswrapper[4757]: I1216 13:27:28.697707 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" event={"ID":"d146c06e-d73a-47a2-8e1f-07ca485b1a72","Type":"ContainerStarted","Data":"3eae308130bf12c6adc09a0f50769fbd626582d329a8d7475345c9ea81290504"} Dec 16 13:27:29 crc kubenswrapper[4757]: I1216 13:27:29.709791 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" event={"ID":"d146c06e-d73a-47a2-8e1f-07ca485b1a72","Type":"ContainerStarted","Data":"4afe571b537e77902bc8c0a3d28d9232ccf2b91987d9ebd536d0f94e56ddd733"} Dec 16 13:27:29 crc kubenswrapper[4757]: I1216 13:27:29.739572 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" podStartSLOduration=2.206411839 podStartE2EDuration="2.739541105s" podCreationTimestamp="2025-12-16 13:27:27 +0000 UTC" firstStartedPulling="2025-12-16 13:27:28.625643847 +0000 UTC m=+2434.053387643" lastFinishedPulling="2025-12-16 13:27:29.158773113 +0000 UTC m=+2434.586516909" observedRunningTime="2025-12-16 13:27:29.729264783 +0000 UTC m=+2435.157008609" watchObservedRunningTime="2025-12-16 13:27:29.739541105 +0000 UTC m=+2435.167284921" Dec 16 13:27:34 crc kubenswrapper[4757]: I1216 13:27:34.968736 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:27:34 crc kubenswrapper[4757]: E1216 13:27:34.969604 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:27:47 crc kubenswrapper[4757]: I1216 13:27:47.950526 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:27:47 crc kubenswrapper[4757]: E1216 13:27:47.951589 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:27:59 crc kubenswrapper[4757]: I1216 13:27:59.948960 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:27:59 crc kubenswrapper[4757]: E1216 13:27:59.949747 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:28:14 crc kubenswrapper[4757]: I1216 13:28:14.954281 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:28:14 crc kubenswrapper[4757]: E1216 13:28:14.954974 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:28:28 crc kubenswrapper[4757]: I1216 13:28:28.949433 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:28:29 crc kubenswrapper[4757]: I1216 13:28:29.190372 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"37a145676157cb0ad283f5e08ca036f53f30097a029365f1169b057c03bf3c30"} Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.150222 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2"] Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.154696 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.159416 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.159611 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.165102 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2"] Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.214393 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7278c05d-34ac-48b0-9c9f-14b6ac22d900-secret-volume\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.214544 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2b7\" (UniqueName: \"kubernetes.io/projected/7278c05d-34ac-48b0-9c9f-14b6ac22d900-kube-api-access-lm2b7\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.214750 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7278c05d-34ac-48b0-9c9f-14b6ac22d900-config-volume\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.316447 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7278c05d-34ac-48b0-9c9f-14b6ac22d900-secret-volume\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.316779 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2b7\" (UniqueName: \"kubernetes.io/projected/7278c05d-34ac-48b0-9c9f-14b6ac22d900-kube-api-access-lm2b7\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.316980 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7278c05d-34ac-48b0-9c9f-14b6ac22d900-config-volume\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.317884 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7278c05d-34ac-48b0-9c9f-14b6ac22d900-config-volume\") pod 
\"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.323904 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7278c05d-34ac-48b0-9c9f-14b6ac22d900-secret-volume\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.340258 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2b7\" (UniqueName: \"kubernetes.io/projected/7278c05d-34ac-48b0-9c9f-14b6ac22d900-kube-api-access-lm2b7\") pod \"collect-profiles-29431530-wc7w2\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.475582 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.899662 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2"] Dec 16 13:30:00 crc kubenswrapper[4757]: W1216 13:30:00.908201 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7278c05d_34ac_48b0_9c9f_14b6ac22d900.slice/crio-680b73790a9ec162079e321a583c99380852f0b20d94c85311d754956ca8e718 WatchSource:0}: Error finding container 680b73790a9ec162079e321a583c99380852f0b20d94c85311d754956ca8e718: Status 404 returned error can't find the container with id 680b73790a9ec162079e321a583c99380852f0b20d94c85311d754956ca8e718 Dec 16 13:30:00 crc kubenswrapper[4757]: I1216 13:30:00.970181 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" event={"ID":"7278c05d-34ac-48b0-9c9f-14b6ac22d900","Type":"ContainerStarted","Data":"680b73790a9ec162079e321a583c99380852f0b20d94c85311d754956ca8e718"} Dec 16 13:30:01 crc kubenswrapper[4757]: I1216 13:30:01.980157 4757 generic.go:334] "Generic (PLEG): container finished" podID="7278c05d-34ac-48b0-9c9f-14b6ac22d900" containerID="b879f1b69035121ff9466ac2c66f8183d417e50badaba0378f22f47678c5a7c3" exitCode=0 Dec 16 13:30:01 crc kubenswrapper[4757]: I1216 13:30:01.980491 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" event={"ID":"7278c05d-34ac-48b0-9c9f-14b6ac22d900","Type":"ContainerDied","Data":"b879f1b69035121ff9466ac2c66f8183d417e50badaba0378f22f47678c5a7c3"} Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.316542 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.473758 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm2b7\" (UniqueName: \"kubernetes.io/projected/7278c05d-34ac-48b0-9c9f-14b6ac22d900-kube-api-access-lm2b7\") pod \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.473843 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7278c05d-34ac-48b0-9c9f-14b6ac22d900-config-volume\") pod \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.473916 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7278c05d-34ac-48b0-9c9f-14b6ac22d900-secret-volume\") pod \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\" (UID: \"7278c05d-34ac-48b0-9c9f-14b6ac22d900\") " Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.474907 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7278c05d-34ac-48b0-9c9f-14b6ac22d900-config-volume" (OuterVolumeSpecName: "config-volume") pod "7278c05d-34ac-48b0-9c9f-14b6ac22d900" (UID: "7278c05d-34ac-48b0-9c9f-14b6ac22d900"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.479663 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7278c05d-34ac-48b0-9c9f-14b6ac22d900-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7278c05d-34ac-48b0-9c9f-14b6ac22d900" (UID: "7278c05d-34ac-48b0-9c9f-14b6ac22d900"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.479839 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7278c05d-34ac-48b0-9c9f-14b6ac22d900-kube-api-access-lm2b7" (OuterVolumeSpecName: "kube-api-access-lm2b7") pod "7278c05d-34ac-48b0-9c9f-14b6ac22d900" (UID: "7278c05d-34ac-48b0-9c9f-14b6ac22d900"). InnerVolumeSpecName "kube-api-access-lm2b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.575556 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm2b7\" (UniqueName: \"kubernetes.io/projected/7278c05d-34ac-48b0-9c9f-14b6ac22d900-kube-api-access-lm2b7\") on node \"crc\" DevicePath \"\"" Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.575587 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7278c05d-34ac-48b0-9c9f-14b6ac22d900-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 13:30:03 crc kubenswrapper[4757]: I1216 13:30:03.575597 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7278c05d-34ac-48b0-9c9f-14b6ac22d900-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 13:30:04 crc kubenswrapper[4757]: I1216 13:30:04.010624 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" event={"ID":"7278c05d-34ac-48b0-9c9f-14b6ac22d900","Type":"ContainerDied","Data":"680b73790a9ec162079e321a583c99380852f0b20d94c85311d754956ca8e718"} Dec 16 13:30:04 crc kubenswrapper[4757]: I1216 13:30:04.010675 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680b73790a9ec162079e321a583c99380852f0b20d94c85311d754956ca8e718" Dec 16 13:30:04 crc kubenswrapper[4757]: I1216 13:30:04.010697 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2" Dec 16 13:30:04 crc kubenswrapper[4757]: I1216 13:30:04.398462 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr"] Dec 16 13:30:04 crc kubenswrapper[4757]: I1216 13:30:04.405809 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431485-p5xjr"] Dec 16 13:30:04 crc kubenswrapper[4757]: I1216 13:30:04.963457 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d862c66e-b538-48e8-bbcb-a0cb2715a7de" path="/var/lib/kubelet/pods/d862c66e-b538-48e8-bbcb-a0cb2715a7de/volumes" Dec 16 13:30:30 crc kubenswrapper[4757]: I1216 13:30:30.803990 4757 scope.go:117] "RemoveContainer" containerID="22218cbb717092852de7d52ee5e3fcb2ec09dfcb7e9a9cd1ada61446d0c8efb4" Dec 16 13:30:51 crc kubenswrapper[4757]: I1216 13:30:51.181422 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:30:51 crc kubenswrapper[4757]: I1216 13:30:51.182243 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:31:21 crc kubenswrapper[4757]: I1216 13:31:21.181227 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 16 13:31:21 crc kubenswrapper[4757]: I1216 13:31:21.181855 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.602931 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5b94l"] Dec 16 13:31:32 crc kubenswrapper[4757]: E1216 13:31:32.605219 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7278c05d-34ac-48b0-9c9f-14b6ac22d900" containerName="collect-profiles" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.605334 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7278c05d-34ac-48b0-9c9f-14b6ac22d900" containerName="collect-profiles" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.605654 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7278c05d-34ac-48b0-9c9f-14b6ac22d900" containerName="collect-profiles" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.623453 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5b94l"] Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.623608 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.675622 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-catalog-content\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.676159 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-utilities\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.676245 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4fs\" (UniqueName: \"kubernetes.io/projected/f109383e-d114-4db2-8c9e-36d2b97189cb-kube-api-access-bp4fs\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.777834 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-utilities\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.777949 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4fs\" (UniqueName: \"kubernetes.io/projected/f109383e-d114-4db2-8c9e-36d2b97189cb-kube-api-access-bp4fs\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 
16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.778034 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-catalog-content\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.778437 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-utilities\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.778468 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-catalog-content\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.806255 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4fs\" (UniqueName: \"kubernetes.io/projected/f109383e-d114-4db2-8c9e-36d2b97189cb-kube-api-access-bp4fs\") pod \"redhat-operators-5b94l\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:32 crc kubenswrapper[4757]: I1216 13:31:32.946732 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:33 crc kubenswrapper[4757]: I1216 13:31:33.600051 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5b94l"] Dec 16 13:31:33 crc kubenswrapper[4757]: I1216 13:31:33.763107 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerStarted","Data":"c86ade33daf1f5cd9adb78a5094f33342d717c5dceb536af38d63fb857fcef13"} Dec 16 13:31:34 crc kubenswrapper[4757]: I1216 13:31:34.772551 4757 generic.go:334] "Generic (PLEG): container finished" podID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerID="85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef" exitCode=0 Dec 16 13:31:34 crc kubenswrapper[4757]: I1216 13:31:34.772602 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerDied","Data":"85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef"} Dec 16 13:31:34 crc kubenswrapper[4757]: I1216 13:31:34.774804 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:31:35 crc kubenswrapper[4757]: I1216 13:31:35.782170 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerStarted","Data":"e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc"} Dec 16 13:31:39 crc kubenswrapper[4757]: I1216 13:31:39.814188 4757 generic.go:334] "Generic (PLEG): container finished" podID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerID="e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc" exitCode=0 Dec 16 13:31:39 crc 
kubenswrapper[4757]: I1216 13:31:39.814330 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerDied","Data":"e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc"} Dec 16 13:31:40 crc kubenswrapper[4757]: I1216 13:31:40.825450 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerStarted","Data":"349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47"} Dec 16 13:31:40 crc kubenswrapper[4757]: I1216 13:31:40.857139 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5b94l" podStartSLOduration=3.241273603 podStartE2EDuration="8.857122489s" podCreationTimestamp="2025-12-16 13:31:32 +0000 UTC" firstStartedPulling="2025-12-16 13:31:34.774504639 +0000 UTC m=+2680.202248435" lastFinishedPulling="2025-12-16 13:31:40.390353525 +0000 UTC m=+2685.818097321" observedRunningTime="2025-12-16 13:31:40.852304131 +0000 UTC m=+2686.280047927" watchObservedRunningTime="2025-12-16 13:31:40.857122489 +0000 UTC m=+2686.284866285" Dec 16 13:31:42 crc kubenswrapper[4757]: I1216 13:31:42.947144 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:42 crc kubenswrapper[4757]: I1216 13:31:42.947656 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:43 crc kubenswrapper[4757]: I1216 13:31:43.997554 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5b94l" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="registry-server" probeResult="failure" output=< Dec 16 13:31:43 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s Dec 16 13:31:43 crc kubenswrapper[4757]: > Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.181400 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.181961 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.182032 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.182660 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37a145676157cb0ad283f5e08ca036f53f30097a029365f1169b057c03bf3c30"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.182704 4757 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://37a145676157cb0ad283f5e08ca036f53f30097a029365f1169b057c03bf3c30" gracePeriod=600 Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.934960 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="37a145676157cb0ad283f5e08ca036f53f30097a029365f1169b057c03bf3c30" exitCode=0 Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.935050 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"37a145676157cb0ad283f5e08ca036f53f30097a029365f1169b057c03bf3c30"} Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.935643 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"} Dec 16 13:31:51 crc kubenswrapper[4757]: I1216 13:31:51.935668 4757 scope.go:117] "RemoveContainer" containerID="efe401eed78cf9f8a3b3b9666c76df265cc4db2ad9092a186c4e406994be54cf" Dec 16 13:31:53 crc kubenswrapper[4757]: I1216 13:31:53.004754 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:53 crc kubenswrapper[4757]: I1216 13:31:53.059860 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:53 crc kubenswrapper[4757]: I1216 13:31:53.258308 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5b94l"] Dec 16 13:31:54 crc kubenswrapper[4757]: I1216 13:31:54.965234 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5b94l" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="registry-server" containerID="cri-o://349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47" gracePeriod=2 Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.432549 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.529639 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-catalog-content\") pod \"f109383e-d114-4db2-8c9e-36d2b97189cb\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.529729 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-utilities\") pod \"f109383e-d114-4db2-8c9e-36d2b97189cb\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.529927 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp4fs\" (UniqueName: \"kubernetes.io/projected/f109383e-d114-4db2-8c9e-36d2b97189cb-kube-api-access-bp4fs\") pod \"f109383e-d114-4db2-8c9e-36d2b97189cb\" (UID: \"f109383e-d114-4db2-8c9e-36d2b97189cb\") " Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.531356 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-utilities" (OuterVolumeSpecName: "utilities") pod "f109383e-d114-4db2-8c9e-36d2b97189cb" (UID: "f109383e-d114-4db2-8c9e-36d2b97189cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.537467 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f109383e-d114-4db2-8c9e-36d2b97189cb-kube-api-access-bp4fs" (OuterVolumeSpecName: "kube-api-access-bp4fs") pod "f109383e-d114-4db2-8c9e-36d2b97189cb" (UID: "f109383e-d114-4db2-8c9e-36d2b97189cb"). InnerVolumeSpecName "kube-api-access-bp4fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.632218 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.632254 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp4fs\" (UniqueName: \"kubernetes.io/projected/f109383e-d114-4db2-8c9e-36d2b97189cb-kube-api-access-bp4fs\") on node \"crc\" DevicePath \"\"" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.655361 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f109383e-d114-4db2-8c9e-36d2b97189cb" (UID: "f109383e-d114-4db2-8c9e-36d2b97189cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.733911 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f109383e-d114-4db2-8c9e-36d2b97189cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.981457 4757 generic.go:334] "Generic (PLEG): container finished" podID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerID="349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47" exitCode=0 Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.981527 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5b94l" Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.981514 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerDied","Data":"349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47"} Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.981906 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b94l" event={"ID":"f109383e-d114-4db2-8c9e-36d2b97189cb","Type":"ContainerDied","Data":"c86ade33daf1f5cd9adb78a5094f33342d717c5dceb536af38d63fb857fcef13"} Dec 16 13:31:55 crc kubenswrapper[4757]: I1216 13:31:55.981939 4757 scope.go:117] "RemoveContainer" containerID="349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.012638 4757 scope.go:117] "RemoveContainer" containerID="e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.019163 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5b94l"] Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.027540 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5b94l"] Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.048693 4757 scope.go:117] "RemoveContainer" containerID="85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.097204 4757 scope.go:117] "RemoveContainer" containerID="349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47" Dec 16 13:31:56 crc kubenswrapper[4757]: E1216 13:31:56.097834 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47\": container with ID starting with 349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47 not found: ID does not exist" containerID="349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.097871 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47"} err="failed to get container status \"349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47\": rpc error: code = NotFound desc = could not find container \"349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47\": container with ID starting with 349c6e1da71803dc5cbdbc7b03f984be06fd442c6b23841b656d753848ab5e47 not found: ID does not exist" Dec 16 13:31:56 crc 
kubenswrapper[4757]: I1216 13:31:56.097896 4757 scope.go:117] "RemoveContainer" containerID="e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc" Dec 16 13:31:56 crc kubenswrapper[4757]: E1216 13:31:56.099166 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc\": container with ID starting with e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc not found: ID does not exist" containerID="e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.099197 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc"} err="failed to get container status \"e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc\": rpc error: code = NotFound desc = could not find container \"e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc\": container with ID starting with e9709bd409d75995be1482ed5424d8c49721e73d95a109e70809b04b55ae7cbc not found: ID does not exist" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.099217 4757 scope.go:117] "RemoveContainer" containerID="85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef" Dec 16 13:31:56 crc kubenswrapper[4757]: E1216 13:31:56.099547 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef\": container with ID starting with 85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef not found: ID does not exist" containerID="85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.099578 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef"} err="failed to get container status \"85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef\": rpc error: code = NotFound desc = could not find container \"85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef\": container with ID starting with 85b31240f2cd883c6d9f4544d5c7174db2a80c87cd5baba8c47e2f3c93892aef not found: ID does not exist" Dec 16 13:31:56 crc kubenswrapper[4757]: I1216 13:31:56.960129 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" path="/var/lib/kubelet/pods/f109383e-d114-4db2-8c9e-36d2b97189cb/volumes" Dec 16 13:33:04 crc kubenswrapper[4757]: I1216 13:33:04.585897 4757 generic.go:334] "Generic (PLEG): container finished" podID="d146c06e-d73a-47a2-8e1f-07ca485b1a72" containerID="4afe571b537e77902bc8c0a3d28d9232ccf2b91987d9ebd536d0f94e56ddd733" exitCode=0 Dec 16 13:33:04 crc kubenswrapper[4757]: I1216 13:33:04.585972 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" event={"ID":"d146c06e-d73a-47a2-8e1f-07ca485b1a72","Type":"ContainerDied","Data":"4afe571b537e77902bc8c0a3d28d9232ccf2b91987d9ebd536d0f94e56ddd733"} Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.024331 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.197492 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-ssh-key\") pod \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.197550 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-secret-0\") pod \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.197587 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-combined-ca-bundle\") pod \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.197688 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-inventory\") pod \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.197717 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwdj4\" (UniqueName: \"kubernetes.io/projected/d146c06e-d73a-47a2-8e1f-07ca485b1a72-kube-api-access-hwdj4\") pod \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\" (UID: \"d146c06e-d73a-47a2-8e1f-07ca485b1a72\") " Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.205152 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d146c06e-d73a-47a2-8e1f-07ca485b1a72-kube-api-access-hwdj4" (OuterVolumeSpecName: "kube-api-access-hwdj4") pod "d146c06e-d73a-47a2-8e1f-07ca485b1a72" (UID: "d146c06e-d73a-47a2-8e1f-07ca485b1a72"). InnerVolumeSpecName "kube-api-access-hwdj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.206025 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d146c06e-d73a-47a2-8e1f-07ca485b1a72" (UID: "d146c06e-d73a-47a2-8e1f-07ca485b1a72"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.226430 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d146c06e-d73a-47a2-8e1f-07ca485b1a72" (UID: "d146c06e-d73a-47a2-8e1f-07ca485b1a72"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.237144 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-inventory" (OuterVolumeSpecName: "inventory") pod "d146c06e-d73a-47a2-8e1f-07ca485b1a72" (UID: "d146c06e-d73a-47a2-8e1f-07ca485b1a72"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.249787 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d146c06e-d73a-47a2-8e1f-07ca485b1a72" (UID: "d146c06e-d73a-47a2-8e1f-07ca485b1a72"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.299946 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.299984 4757 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.299998 4757 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.300026 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d146c06e-d73a-47a2-8e1f-07ca485b1a72-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.300058 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwdj4\" (UniqueName: \"kubernetes.io/projected/d146c06e-d73a-47a2-8e1f-07ca485b1a72-kube-api-access-hwdj4\") on node \"crc\" DevicePath \"\"" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.611900 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" event={"ID":"d146c06e-d73a-47a2-8e1f-07ca485b1a72","Type":"ContainerDied","Data":"3eae308130bf12c6adc09a0f50769fbd626582d329a8d7475345c9ea81290504"} Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.611952 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eae308130bf12c6adc09a0f50769fbd626582d329a8d7475345c9ea81290504" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.612038 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.711662 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd"] Dec 16 13:33:06 crc kubenswrapper[4757]: E1216 13:33:06.712127 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d146c06e-d73a-47a2-8e1f-07ca485b1a72" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.712145 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d146c06e-d73a-47a2-8e1f-07ca485b1a72" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 13:33:06 crc kubenswrapper[4757]: E1216 13:33:06.712161 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="extract-utilities" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.712170 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="extract-utilities" Dec 16 13:33:06 crc kubenswrapper[4757]: E1216 13:33:06.712207 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="extract-content" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.712213 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="extract-content" Dec 16 13:33:06 crc kubenswrapper[4757]: E1216 13:33:06.712238 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="registry-server" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.712244 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="registry-server" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.712428 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d146c06e-d73a-47a2-8e1f-07ca485b1a72" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.712451 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f109383e-d114-4db2-8c9e-36d2b97189cb" containerName="registry-server" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.713479 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.718598 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.718900 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.719096 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.719281 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.719410 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.719508 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.722558 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.747143 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd"] Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808045 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808434 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/13b0d4c7-5eab-400a-9513-9391342fffee-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808464 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808512 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv2m5\" (UniqueName: \"kubernetes.io/projected/13b0d4c7-5eab-400a-9513-9391342fffee-kube-api-access-bv2m5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808559 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808628 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808683 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808732 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.808769 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.909973 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910081 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910123 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910165 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910195 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/13b0d4c7-5eab-400a-9513-9391342fffee-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910219 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910271 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv2m5\" (UniqueName: \"kubernetes.io/projected/13b0d4c7-5eab-400a-9513-9391342fffee-kube-api-access-bv2m5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910309 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.910377 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.911751 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/13b0d4c7-5eab-400a-9513-9391342fffee-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.916458 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.916949 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.919514 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.919805 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.920289 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.921541 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.923265 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:06 crc kubenswrapper[4757]: I1216 13:33:06.932255 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv2m5\" (UniqueName: \"kubernetes.io/projected/13b0d4c7-5eab-400a-9513-9391342fffee-kube-api-access-bv2m5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-8dcjd\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:07 crc kubenswrapper[4757]: I1216 13:33:07.035895 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:33:07 crc kubenswrapper[4757]: I1216 13:33:07.608707 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd"] Dec 16 13:33:07 crc kubenswrapper[4757]: I1216 13:33:07.626133 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" event={"ID":"13b0d4c7-5eab-400a-9513-9391342fffee","Type":"ContainerStarted","Data":"20efffac4799d0a2f944f95a039ff006ce8bbf18839798cdb7055533e9ed6a8b"} Dec 16 13:33:08 crc kubenswrapper[4757]: I1216 13:33:08.638324 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" event={"ID":"13b0d4c7-5eab-400a-9513-9391342fffee","Type":"ContainerStarted","Data":"a89ecd7bc1cd7939fe167012e4296c920ab9110c143d3e6db32f782e7a07ae88"} Dec 16 13:33:08 crc kubenswrapper[4757]: I1216 13:33:08.663518 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" podStartSLOduration=2.14473167 podStartE2EDuration="2.663494026s" podCreationTimestamp="2025-12-16 13:33:06 +0000 UTC" firstStartedPulling="2025-12-16 13:33:07.606078112 +0000 UTC m=+2773.033821918" lastFinishedPulling="2025-12-16 13:33:08.124840478 +0000 UTC m=+2773.552584274" observedRunningTime="2025-12-16 13:33:08.656664048 +0000 UTC m=+2774.084407844" watchObservedRunningTime="2025-12-16 13:33:08.663494026 +0000 UTC m=+2774.091237842" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.643384 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gpnfn"] Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.647381 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.683277 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpnfn"] Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.770184 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6h9\" (UniqueName: \"kubernetes.io/projected/15380341-a6d5-4aa4-8447-05b61eaa936b-kube-api-access-rm6h9\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.770286 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-catalog-content\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.770344 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-utilities\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.872398 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6h9\" (UniqueName: \"kubernetes.io/projected/15380341-a6d5-4aa4-8447-05b61eaa936b-kube-api-access-rm6h9\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.872551 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-catalog-content\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.872604 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-utilities\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.873110 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-utilities\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.873112 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-catalog-content\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.891193 4757 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rm6h9\" (UniqueName: \"kubernetes.io/projected/15380341-a6d5-4aa4-8447-05b61eaa936b-kube-api-access-rm6h9\") pod \"redhat-marketplace-gpnfn\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:30 crc kubenswrapper[4757]: I1216 13:33:30.973579 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:33:31 crc kubenswrapper[4757]: I1216 13:33:31.451201 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpnfn"] Dec 16 13:33:31 crc kubenswrapper[4757]: I1216 13:33:31.848187 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerStarted","Data":"47132003a947ed661ffe43be48de291d893ed3ef9f1d09ae62cfbe5e5cc2f578"} Dec 16 13:34:01 crc kubenswrapper[4757]: I1216 13:34:01.291771 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 16 13:34:01 crc kubenswrapper[4757]: I1216 13:34:01.302822 4757 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5l22d container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:01 crc kubenswrapper[4757]: I1216 13:34:01.331469 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" podUID="6a9f576c-c2af-47cb-8a4d-f7d8784aad87" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:01 crc kubenswrapper[4757]: I1216 13:34:01.302842 4757 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5l22d container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:01 crc kubenswrapper[4757]: I1216 13:34:01.331814 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5l22d" podUID="6a9f576c-c2af-47cb-8a4d-f7d8784aad87" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:01 crc kubenswrapper[4757]: I1216 13:34:01.393722 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-544dfc5bc-8q666" podUID="38a8b3dc-7995-4851-96db-0fb6749669b9" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.685501 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager 
namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.772017 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.777583 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/healthz\": EOF" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.729802 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": EOF" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.737224 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": EOF" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.771315 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.771970 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/healthz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:02 crc kubenswrapper[4757]: E1216 13:34:01.699147 4757 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.739920 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.816513 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.747931 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager 
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.747931 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.816573 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.713752 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/healthz\": EOF"
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.854547 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": EOF"
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.856760 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/healthz\": EOF"
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.868729 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.897303 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.897500 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-544dfc5bc-8q666" podUID="38a8b3dc-7995-4851-96db-0fb6749669b9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 16 13:34:02 crc kubenswrapper[4757]: E1216 13:34:01.949604 4757 kubelet.go:2359] "Skipping pod synchronization" err="container runtime is down"
Dec 16 13:34:02 crc kubenswrapper[4757]: I1216 13:34:01.954455 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": EOF"
Dec 16 13:34:02 crc kubenswrapper[4757]: E1216 13:34:02.342061 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f15431_d8cd_408d_8169_e06457cabccc.slice/crio-873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:34:03 crc kubenswrapper[4757]: I1216 13:34:03.357800 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/healthz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:03 crc kubenswrapper[4757]: I1216 13:34:03.357901 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:04 crc kubenswrapper[4757]: I1216 13:34:04.126371 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:34:05 crc kubenswrapper[4757]: I1216 13:34:05.543593 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:05 crc kubenswrapper[4757]: I1216 13:34:05.543953 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:06 crc kubenswrapper[4757]: I1216 13:34:06.624388 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused" Dec 16 13:34:06 crc kubenswrapper[4757]: I1216 13:34:06.624951 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/healthz\": dial tcp 10.217.0.83:8081: connect: connection refused" Dec 16 13:34:06 crc kubenswrapper[4757]: I1216 13:34:06.888758 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused" Dec 16 13:34:06 crc kubenswrapper[4757]: I1216 13:34:06.890112 4757 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/healthz\": dial tcp 10.217.0.77:8081: connect: connection refused" Dec 16 13:34:07 crc kubenswrapper[4757]: I1216 13:34:07.798684 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Dec 16 13:34:07 crc kubenswrapper[4757]: I1216 13:34:07.801550 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 16 13:34:07 crc kubenswrapper[4757]: I1216 13:34:07.867019 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:34:07 crc kubenswrapper[4757]: I1216 13:34:07.867218 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/healthz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:34:11 crc kubenswrapper[4757]: I1216 13:34:11.517203 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:11 crc kubenswrapper[4757]: I1216 13:34:11.517634 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:11 crc kubenswrapper[4757]: I1216 13:34:11.517843 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 13:34:11 crc kubenswrapper[4757]: I1216 13:34:11.518726 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:11 crc kubenswrapper[4757]: I1216 13:34:11.518802 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:12 crc kubenswrapper[4757]: I1216 13:34:12.799913 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" 
podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 16 13:34:13 crc kubenswrapper[4757]: I1216 13:34:13.356966 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:13 crc kubenswrapper[4757]: I1216 13:34:13.357175 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:34:13 crc kubenswrapper[4757]: I1216 13:34:13.358306 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:14 crc kubenswrapper[4757]: I1216 13:34:14.126449 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:34:14 crc kubenswrapper[4757]: I1216 13:34:14.126580 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:34:14 crc kubenswrapper[4757]: I1216 13:34:14.127426 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.543743 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.544180 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.544235 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.545130 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"31cc4cc27906ee7e5415e9f8ee9159e8dbf3106af6cdc17089aebbc68aae2cd6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed liveness 
Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.545130 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"31cc4cc27906ee7e5415e9f8ee9159e8dbf3106af6cdc17089aebbc68aae2cd6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed liveness probe, will be restarted"
Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.545203 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.545244 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://31cc4cc27906ee7e5415e9f8ee9159e8dbf3106af6cdc17089aebbc68aae2cd6" gracePeriod=30
Dec 16 13:34:15 crc kubenswrapper[4757]: I1216 13:34:15.545253 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 16 13:34:16 crc kubenswrapper[4757]: I1216 13:34:16.623865 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused"
Dec 16 13:34:16 crc kubenswrapper[4757]: I1216 13:34:16.624352 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"
Dec 16 13:34:16 crc kubenswrapper[4757]: I1216 13:34:16.625327 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused"
Dec 16 13:34:16 crc kubenswrapper[4757]: I1216 13:34:16.889807 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused"
Dec 16 13:34:16 crc kubenswrapper[4757]: I1216 13:34:16.889921 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"
Dec 16 13:34:16 crc kubenswrapper[4757]: I1216 13:34:16.891225 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused"
Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.801351 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
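The "Killing container with a grace period ... gracePeriod=30" entry above is the standard two-step termination: the runtime sends SIGTERM first, then SIGKILL once the grace period lapses. The ceilometer-central-agent container killed just below eventually reports exitCode=137 (128+9, i.e. death by SIGKILL), meaning it never acted on the SIGTERM. A stdlib sketch of the same pattern (POSIX sh assumed; this is illustrative, not CRI-O code):

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// A child that ignores SIGTERM, standing in for a wedged container.
	cmd := exec.Command("sh", "-c", `trap '' TERM; sleep 300`)
	if err := cmd.Start(); err != nil {
		panic(err)
	}

	gracePeriod := 2 * time.Second // the log entries above use 30s and 10s

	_ = cmd.Process.Signal(syscall.SIGTERM) // step 1: polite stop

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	select {
	case err := <-done:
		fmt.Println("exited within grace period:", err)
	case <-time.After(gracePeriod):
		// Step 2: hard stop; the wait status then reports SIGKILL, which
		// container runtimes surface as exit code 137 (= 128 + 9).
		_ = cmd.Process.Kill()
		fmt.Println("grace period expired, sent SIGKILL:", <-done)
	}
}
```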
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.802778 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"efccb4b3e660304aaaf43832f8e13cbebaf5a9f019ac85d25fea4b90b97c3966"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.802879 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-central-agent" containerID="cri-o://efccb4b3e660304aaaf43832f8e13cbebaf5a9f019ac85d25fea4b90b97c3966" gracePeriod=30 Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.866586 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.866944 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.868059 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:34:17 crc kubenswrapper[4757]: I1216 13:34:17.948319 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-psxvw" podUID="75d829d5-a3cd-48c6-8aff-07f7d325b4f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.80:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:20 crc kubenswrapper[4757]: I1216 13:34:20.797049 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="c2d16196-ea98-44e5-b859-bea9a8392c01" containerName="galera" probeResult="failure" output="command timed out" Dec 16 13:34:20 crc kubenswrapper[4757]: I1216 13:34:20.798222 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="c2d16196-ea98-44e5-b859-bea9a8392c01" containerName="galera" probeResult="failure" output="command timed out" Dec 16 13:34:21 crc kubenswrapper[4757]: I1216 13:34:21.181906 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:34:21 crc kubenswrapper[4757]: I1216 13:34:21.182312 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:34:21 
crc kubenswrapper[4757]: I1216 13:34:21.517655 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:21 crc kubenswrapper[4757]: I1216 13:34:21.518070 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:21 crc kubenswrapper[4757]: I1216 13:34:21.798747 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Dec 16 13:34:23 crc kubenswrapper[4757]: I1216 13:34:23.357207 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:23 crc kubenswrapper[4757]: I1216 13:34:23.357207 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/healthz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:23 crc kubenswrapper[4757]: I1216 13:34:23.357359 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:34:23 crc kubenswrapper[4757]: I1216 13:34:23.357779 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:23 crc kubenswrapper[4757]: I1216 13:34:23.358075 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="manager" containerStatusID={"Type":"cri-o","ID":"470eb1409f4aed957b7b2a9038ea8e32c408a7d6788cc603bb5830d7c3eb700a"} pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" containerMessage="Container manager failed liveness probe, will be restarted" Dec 16 13:34:23 crc kubenswrapper[4757]: I1216 13:34:23.358116 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" containerID="cri-o://470eb1409f4aed957b7b2a9038ea8e32c408a7d6788cc603bb5830d7c3eb700a" gracePeriod=10 Dec 16 13:34:24 crc kubenswrapper[4757]: I1216 13:34:24.125519 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" 
podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.146211 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.656985204s: [/var/lib/containers/storage/overlay/5fe571fc8d39edbf1bdeb9474116f8d6499970ea45e4000c6b5d4c9d83679922/diff /var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/frr/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.146301 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.418036449s: [/var/lib/containers/storage/overlay/1d0216852745d12d13fa8c00ab5d351f4b95e51c483580ae9918deb6478c65b5/diff /var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/frr-metrics/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.146436 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.982578726s: [/var/lib/containers/storage/overlay/1e6edf57e94be37dadc71ad55b00364f3e5f275233fac38fb9fc555e4ab60fcc/diff /var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/kube-rbac-proxy-frr/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.146659 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.174272705s: [/var/lib/containers/storage/overlay/daa64c34abd9a806016a3657dca4794a73f1d70b1f4a8c6b9dd7a2b9ba527414/diff /var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.151702 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.857994073s: [/var/lib/containers/storage/overlay/c50e6ccde71b6b3c8f0581400641b486aabc540bc0db21239f89ec39dfe62ad3/diff /var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/controller/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.152071 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.583487759s: [/var/lib/containers/storage/overlay/753bab28d6d0a800b454d304a974ef21bd519518dc86304374b1337f4f384814/diff /var/log/pods/openstack_ovsdbserver-nb-0_89cc68a0-15fd-4a20-bd71-9c8acb5a92c7/openstack-network-exporter/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.152303 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.578572919s: [/var/lib/containers/storage/overlay/c963b6c8b650447b6915b7b745818d2779bc22acb0ba9a52919b3bc9efd5052f/diff /var/log/pods/openstack_ovsdbserver-sb-0_972a26d6-4f3b-4fc4-8e86-055dfe33652a/openstack-network-exporter/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.152487 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.552661729s: [/var/lib/containers/storage/overlay/ee4e2dc6ad005c175244e9b8ea3c67c007b1f759f91f5ab9d1df9e5f9d09751b/diff 
/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/reloader/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.156994 4757 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.316344792s: [/var/lib/containers/storage/overlay/dc5e8af94e5df06b64e7c5acb3cf1e9843f08c7540c9134f492fe6c3de91889c/diff /var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-749d76644c-s8t5c_f8c859be-7650-49fa-a810-1bd096153c33/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.186772 4757 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187115 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.186938 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-dfprv" podUID="0ea2db10-97ca-4173-9766-c34220e3958b" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.186912 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-dfprv" podUID="0ea2db10-97ca-4173-9766-c34220e3958b" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.186879 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-dfprv" podUID="0ea2db10-97ca-4173-9766-c34220e3958b" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187591 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-5bddd4b946-jwvt9" podUID="dd784875-6828-4554-8791-24182d80b82f" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.51:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187671 4757 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-45d7q container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187702 4757 prober.go:107] "Probe failed" probeType="Readiness" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-cgxqx" podUID="04fe3c89-14a7-4830-b290-538d3ae20a12" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.70:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187711 4757 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-45d7q container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187845 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" podUID="18370ed0-2552-4394-ab48-5e61b770ad66" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187956 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-45d7q" podUID="18370ed0-2552-4394-ab48-5e61b770ad66" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.187652 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-5bddd4b946-jwvt9" podUID="dd784875-6828-4554-8791-24182d80b82f" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.51:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.188182 4757 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z52vp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.188219 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" podUID="b40bf055-8b99-4c86-9e45-ed2253aa09a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.188263 4757 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z52vp container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.188292 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-z52vp" podUID="b40bf055-8b99-4c86-9e45-ed2253aa09a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.213562 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/healthz\": dial tcp 10.217.0.83:8081: connect: connection refused"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.213683 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.230115 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/healthz\": dial tcp 10.217.0.89:8081: connect: connection refused"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.230201 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.251905 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.251906 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/healthz\": dial tcp 10.217.0.77:8081: connect: connection refused"
Dec 16 13:34:30 crc kubenswrapper[4757]: E1216 13:34:30.264568 4757 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.317s"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.294893 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.294923 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8"
Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.294932 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz"
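"Housekeeping took longer than expected ... expected="1s" actual="3.317s"" is the kubelet comparing each housekeeping pass against a fixed one-second expectation; the fsHandler entries above, with single directory scans taking 3-4s, are the obvious culprit here. The check itself is nothing more than a duration comparison, roughly as follows (simplified, not the actual kubelet source):

```go
package main

import (
	"fmt"
	"time"
)

// doHousekeeping stands in for the real work: image/container GC and
// cadvisor stats collection, which here was stalled on slow disk scans.
func doHousekeeping() { time.Sleep(1500 * time.Millisecond) }

func main() {
	const expected = 1 * time.Second // the kubelet's housekeeping expectation

	start := time.Now()
	doHousekeeping()
	actual := time.Since(start)

	if actual > expected {
		// Mirrors: "Housekeeping took longer than expected"
		//          err="housekeeping took too long" expected="1s" actual="3.317s"
		fmt.Printf("housekeeping took too long: expected=%v actual=%v\n",
			expected, actual.Round(time.Millisecond))
	}
}
```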
pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" containerMessage="Container manager failed liveness probe, will be restarted" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.295600 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" containerID="cri-o://4ade807f82acd56ef4e7773356e3af9982ed2216a9f53ba2729b438b82747b0c" gracePeriod=10 Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.295985 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.296199 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="manager" containerStatusID={"Type":"cri-o","ID":"873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42"} pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" containerMessage="Container manager failed liveness probe, will be restarted" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.296253 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" containerID="cri-o://873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42" gracePeriod=10 Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.296292 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.296547 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.296673 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="manager" containerStatusID={"Type":"cri-o","ID":"35a1bd5dd410692a492294e2f3487d2cfd04f9b69a087852025e7589c08d90b9"} pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" containerMessage="Container manager failed liveness probe, will be restarted" Dec 16 13:34:30 crc kubenswrapper[4757]: I1216 13:34:30.296701 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" containerID="cri-o://35a1bd5dd410692a492294e2f3487d2cfd04f9b69a087852025e7589c08d90b9" gracePeriod=10 Dec 16 13:34:31 crc kubenswrapper[4757]: I1216 13:34:31.517639 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe 
status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:31 crc kubenswrapper[4757]: I1216 13:34:31.518068 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:33 crc kubenswrapper[4757]: I1216 13:34:33.357462 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:34 crc kubenswrapper[4757]: I1216 13:34:34.125503 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:34:36 crc kubenswrapper[4757]: I1216 13:34:36.625275 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused" Dec 16 13:34:36 crc kubenswrapper[4757]: I1216 13:34:36.889194 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused" Dec 16 13:34:37 crc kubenswrapper[4757]: I1216 13:34:37.866179 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:34:37 crc kubenswrapper[4757]: I1216 13:34:37.963849 4757 generic.go:334] "Generic (PLEG): container finished" podID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerID="470eb1409f4aed957b7b2a9038ea8e32c408a7d6788cc603bb5830d7c3eb700a" exitCode=-1 Dec 16 13:34:37 crc kubenswrapper[4757]: I1216 13:34:37.963962 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" event={"ID":"67c1a6c7-35d1-48a9-a058-13e5d5599fe7","Type":"ContainerDied","Data":"470eb1409f4aed957b7b2a9038ea8e32c408a7d6788cc603bb5830d7c3eb700a"} Dec 16 13:34:41 crc kubenswrapper[4757]: I1216 13:34:41.518204 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:34:41 crc kubenswrapper[4757]: 
I1216 13:34:41.518709 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:34:43 crc kubenswrapper[4757]: I1216 13:34:43.357452 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:34:44 crc kubenswrapper[4757]: I1216 13:34:44.125618 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.364394 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" podUID="6333c537-0505-48c0-b197-a609084a2a2c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": dial tcp 10.217.0.77:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.366020 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" podUID="3717fd56-4339-4ad6-940d-b5023c76d32f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.71:8081/healthz\": dial tcp 10.217.0.71:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.366098 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" podUID="3717fd56-4339-4ad6-940d-b5023c76d32f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.71:8081/readyz\": dial tcp 10.217.0.71:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.381633 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" podUID="a6449c1f-3695-445d-90b0-64b4c79cde05" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.72:8081/healthz\": dial tcp 10.217.0.72:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.457307 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" podUID="a6449c1f-3695-445d-90b0-64b4c79cde05" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.72:8081/readyz\": dial tcp 10.217.0.72:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.498494 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-notification-agent" probeResult="failure" output=< Dec 16 13:35:07 crc kubenswrapper[4757]: Unkown error: Expecting value: line 1 column 1 (char 0) Dec 16 13:35:07 crc kubenswrapper[4757]: > Dec 16 
13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.527851 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.527901 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.528599 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" podUID="67c1a6c7-35d1-48a9-a058-13e5d5599fe7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.85:8081/readyz\": dial tcp 10.217.0.85:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.552212 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" podUID="019f84e1-6fee-4829-a087-c756c955060a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": dial tcp 10.217.0.47:8080: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.579571 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" podUID="e9f15431-d8cd-408d-8169-e06457cabccc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.83:8081/readyz\": dial tcp 10.217.0.83:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.627396 4757 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.627689 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.656812 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.728607 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused" Dec 16 13:35:07 crc kubenswrapper[4757]: E1216 13:35:07.734478 4757 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="22.786s" Dec 16 
13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.734899 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.734953 4757 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="31cc4cc27906ee7e5415e9f8ee9159e8dbf3106af6cdc17089aebbc68aae2cd6" exitCode=-1
Dec 16 13:35:07 crc kubenswrapper[4757]: I1216 13:35:07.868844 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" podUID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": dial tcp 10.217.0.89:8081: connect: connection refused"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.025961 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.026369 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.026400 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"31cc4cc27906ee7e5415e9f8ee9159e8dbf3106af6cdc17089aebbc68aae2cd6"}
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.026450 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6x6g8"]
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.033640 4757 scope.go:117] "RemoveContainer" containerID="9cf83d197879a24df816a1800c8c9fe981dc53a81119132602f10e353187693c"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.061940 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6x6g8"]
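The PLEG "container finished" records in this stretch carry three different exit codes worth distinguishing: exitCode=-1 (the runtime could not report a real status, typical when the container died while CRI-O itself was unresponsive, as with kube-controller-manager here), exitCode=1 just below (an ordinary application-error exit), and exitCode=137 (SIGKILL after the grace period). The decoding convention is the usual 128+signal rule, e.g. (Unix-only stdlib sketch, not kubelet code):

```go
package main

import (
	"errors"
	"fmt"
	"os/exec"
	"syscall"
)

// exitCode runs cmd and decodes its exit status the way container
// runtimes report it: plain code, 128+signal, or -1 if unknown.
func exitCode(cmd *exec.Cmd) int {
	err := cmd.Run()
	if err == nil {
		return 0
	}
	var ee *exec.ExitError
	if errors.As(err, &ee) {
		if ws, ok := ee.Sys().(syscall.WaitStatus); ok && ws.Signaled() {
			return 128 + int(ws.Signal()) // e.g. 137 = 128 + SIGKILL(9)
		}
		return ee.ExitCode()
	}
	return -1 // could not run or status unknown, cf. exitCode=-1 above
}

func main() {
	fmt.Println(exitCode(exec.Command("false")))                  // 1
	fmt.Println(exitCode(exec.Command("sh", "-c", "kill -9 $$"))) // 137
}
```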
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.062556 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6x6g8"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.215776 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-utilities\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.215821 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-catalog-content\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.221403 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg5x8\" (UniqueName: \"kubernetes.io/projected/daba874e-e472-48a3-b4f5-dce301d1438c-kube-api-access-sg5x8\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8"
Dec 16 13:35:08 crc kubenswrapper[4757]: E1216 13:35:08.285591 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f15431_d8cd_408d_8169_e06457cabccc.slice/crio-conmon-873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6449c1f_3695_445d_90b0_64b4c79cde05.slice/crio-conmon-0e96a81a682a19775b899ea79d1adc3ab0d7cb597c6276c988e4f7bccffc08de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3717fd56_4339_4ad6_940d_b5023c76d32f.slice/crio-conmon-0a886290841f66c8a939ca85d74a0da3dd45779a873263fae160e5ab39ced3e5.scope\": RecentStats: unable to find data in memory cache]"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.325284 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-utilities\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.325462 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-catalog-content\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8"
Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.325512 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg5x8\" (UniqueName: \"kubernetes.io/projected/daba874e-e472-48a3-b4f5-dce301d1438c-kube-api-access-sg5x8\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8"
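The three volumes for community-operators-6x6g8, two emptyDirs plus one projected service-account token, each traverse the same states: VerifyControllerAttachedVolume above, then MountVolume started, then MountVolume.SetUp succeeded just below. The volume manager's reconciler is at heart a diff between the desired state of world (what the pod spec requires) and the actual state (what is currently mounted); a toy rendition of one pass (hypothetical structure, not kubelet code):

```go
package main

import "fmt"

func main() {
	// Desired state of world: volumes the pod spec requires.
	desired := []string{"utilities", "catalog-content", "kube-api-access-sg5x8"}

	// Actual state of world: volumes already mounted for this pod.
	mounted := map[string]bool{}

	// One reconciler pass: mount anything desired but not yet mounted.
	for _, v := range desired {
		if mounted[v] {
			continue
		}
		fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
		// The real SetUp would prepare an emptyDir directory or write the
		// projected service-account token, then record it as mounted.
		mounted[v] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
	}
}
```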
kubenswrapper[4757]: I1216 13:35:08.326069 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-utilities\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.326195 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-catalog-content\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.349452 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg5x8\" (UniqueName: \"kubernetes.io/projected/daba874e-e472-48a3-b4f5-dce301d1438c-kube-api-access-sg5x8\") pod \"community-operators-6x6g8\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.677873 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.749339 4757 generic.go:334] "Generic (PLEG): container finished" podID="a6449c1f-3695-445d-90b0-64b4c79cde05" containerID="0e96a81a682a19775b899ea79d1adc3ab0d7cb597c6276c988e4f7bccffc08de" exitCode=1 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.749578 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" event={"ID":"a6449c1f-3695-445d-90b0-64b4c79cde05","Type":"ContainerDied","Data":"0e96a81a682a19775b899ea79d1adc3ab0d7cb597c6276c988e4f7bccffc08de"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.750694 4757 scope.go:117] "RemoveContainer" containerID="0e96a81a682a19775b899ea79d1adc3ab0d7cb597c6276c988e4f7bccffc08de" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.752494 4757 generic.go:334] "Generic (PLEG): container finished" podID="6333c537-0505-48c0-b197-a609084a2a2c" containerID="4ade807f82acd56ef4e7773356e3af9982ed2216a9f53ba2729b438b82747b0c" exitCode=1 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.752537 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" event={"ID":"6333c537-0505-48c0-b197-a609084a2a2c","Type":"ContainerDied","Data":"4ade807f82acd56ef4e7773356e3af9982ed2216a9f53ba2729b438b82747b0c"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.764551 4757 generic.go:334] "Generic (PLEG): container finished" podID="019f84e1-6fee-4829-a087-c756c955060a" containerID="3e0be9ae6845db7ec969b655807af38b8cc48f14eba930237001d942424668c3" exitCode=1 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.764656 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" event={"ID":"019f84e1-6fee-4829-a087-c756c955060a","Type":"ContainerDied","Data":"3e0be9ae6845db7ec969b655807af38b8cc48f14eba930237001d942424668c3"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.765586 4757 scope.go:117] "RemoveContainer" containerID="3e0be9ae6845db7ec969b655807af38b8cc48f14eba930237001d942424668c3" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 
13:35:08.774563 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.794256 4757 generic.go:334] "Generic (PLEG): container finished" podID="3717fd56-4339-4ad6-940d-b5023c76d32f" containerID="0a886290841f66c8a939ca85d74a0da3dd45779a873263fae160e5ab39ced3e5" exitCode=1 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.794338 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" event={"ID":"3717fd56-4339-4ad6-940d-b5023c76d32f","Type":"ContainerDied","Data":"0a886290841f66c8a939ca85d74a0da3dd45779a873263fae160e5ab39ced3e5"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.797668 4757 scope.go:117] "RemoveContainer" containerID="0a886290841f66c8a939ca85d74a0da3dd45779a873263fae160e5ab39ced3e5" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.804103 4757 generic.go:334] "Generic (PLEG): container finished" podID="e9f15431-d8cd-408d-8169-e06457cabccc" containerID="873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42" exitCode=1 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.804180 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" event={"ID":"e9f15431-d8cd-408d-8169-e06457cabccc","Type":"ContainerDied","Data":"873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.812849 4757 generic.go:334] "Generic (PLEG): container finished" podID="42d952f0-a650-484d-9e6b-b1c6c0f252dc" containerID="35a1bd5dd410692a492294e2f3487d2cfd04f9b69a087852025e7589c08d90b9" exitCode=1 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.813035 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" event={"ID":"42d952f0-a650-484d-9e6b-b1c6c0f252dc","Type":"ContainerDied","Data":"35a1bd5dd410692a492294e2f3487d2cfd04f9b69a087852025e7589c08d90b9"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.852689 4757 generic.go:334] "Generic (PLEG): container finished" podID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerID="efccb4b3e660304aaaf43832f8e13cbebaf5a9f019ac85d25fea4b90b97c3966" exitCode=137 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.852790 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerDied","Data":"efccb4b3e660304aaaf43832f8e13cbebaf5a9f019ac85d25fea4b90b97c3966"} Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.865312 4757 generic.go:334] "Generic (PLEG): container finished" podID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerID="fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63" exitCode=0 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.866453 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.866524 4757 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" gracePeriod=600 Dec 16 13:35:08 crc kubenswrapper[4757]: I1216 13:35:08.866650 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerDied","Data":"fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63"} Dec 16 13:35:09 crc kubenswrapper[4757]: E1216 13:35:09.012904 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.295306 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6x6g8"] Dec 16 13:35:09 crc kubenswrapper[4757]: W1216 13:35:09.301199 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaba874e_e472_48a3_b4f5_dce301d1438c.slice/crio-2fd9a7ea9e42bf721455fa4a9d5b3e63b561e600656dc222b0512d2472945ab8 WatchSource:0}: Error finding container 2fd9a7ea9e42bf721455fa4a9d5b3e63b561e600656dc222b0512d2472945ab8: Status 404 returned error can't find the container with id 2fd9a7ea9e42bf721455fa4a9d5b3e63b561e600656dc222b0512d2472945ab8 Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.885510 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" event={"ID":"3717fd56-4339-4ad6-940d-b5023c76d32f","Type":"ContainerStarted","Data":"5f7f2a902f89f1a450a8f0ce71b5d9a2e0f72e3bb13e755280ed2e151b60eefe"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.886858 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.890200 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" event={"ID":"6333c537-0505-48c0-b197-a609084a2a2c","Type":"ContainerStarted","Data":"53278c36f00ca3f828847f4b53a2fa1438bb431a26fb81a1f73dd3e889dbf304"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.890339 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.893393 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" event={"ID":"019f84e1-6fee-4829-a087-c756c955060a","Type":"ContainerStarted","Data":"e405986ffb102a19c84105073e0cb8f50ac4fb7cffb2fc0c41e426e891fa4ee0"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.893794 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.899667 4757 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.905444 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f9c78ffc4b004dbf52dc4dcecb864bd22d74818ec1b049fb94444be61a15cbb1"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.930549 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"d0a6753222f9ab38cfa447c7abe341073351af841179bf93c4f8c6378deece59"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.931427 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-notification-agent" containerStatusID={"Type":"cri-o","ID":"dec1770f5c192f62e118934bead7638fcc2c0688f9c70c02c9368ca807f35240"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-notification-agent failed liveness probe, will be restarted" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.931515 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerName="ceilometer-notification-agent" containerID="cri-o://dec1770f5c192f62e118934bead7638fcc2c0688f9c70c02c9368ca807f35240" gracePeriod=30 Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.936821 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerStarted","Data":"644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.956480 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" exitCode=0 Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.956584 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.956626 4757 scope.go:117] "RemoveContainer" containerID="37a145676157cb0ad283f5e08ca036f53f30097a029365f1169b057c03bf3c30" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.957458 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:35:09 crc kubenswrapper[4757]: E1216 13:35:09.957785 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.965644 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" 
event={"ID":"e9f15431-d8cd-408d-8169-e06457cabccc","Type":"ContainerStarted","Data":"bfec2daca54140ec3ecfd6a518c9f32fe831b98908d678962f7059152f7055ca"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.966687 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.973896 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" event={"ID":"42d952f0-a650-484d-9e6b-b1c6c0f252dc","Type":"ContainerStarted","Data":"9be2f95fe9b86df5c3ec63dcac0d5f146230bcc5bc89b746a49ed767e98512d4"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.974226 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.994886 4757 generic.go:334] "Generic (PLEG): container finished" podID="daba874e-e472-48a3-b4f5-dce301d1438c" containerID="934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1" exitCode=0 Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.995028 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x6g8" event={"ID":"daba874e-e472-48a3-b4f5-dce301d1438c","Type":"ContainerDied","Data":"934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1"} Dec 16 13:35:09 crc kubenswrapper[4757]: I1216 13:35:09.995065 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x6g8" event={"ID":"daba874e-e472-48a3-b4f5-dce301d1438c","Type":"ContainerStarted","Data":"2fd9a7ea9e42bf721455fa4a9d5b3e63b561e600656dc222b0512d2472945ab8"} Dec 16 13:35:10 crc kubenswrapper[4757]: I1216 13:35:10.064459 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" event={"ID":"67c1a6c7-35d1-48a9-a058-13e5d5599fe7","Type":"ContainerStarted","Data":"6c497d6a78f3eba8c0c54bb442af8327f3bcd22b1c8c9a474573c07f3fe55107"} Dec 16 13:35:10 crc kubenswrapper[4757]: I1216 13:35:10.066151 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:35:10 crc kubenswrapper[4757]: I1216 13:35:10.090958 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" event={"ID":"a6449c1f-3695-445d-90b0-64b4c79cde05","Type":"ContainerStarted","Data":"0c63b9bf81ef5fe0db93789a4041e10ee47a2f24f704bbcb8c48f2100b77fee2"} Dec 16 13:35:10 crc kubenswrapper[4757]: I1216 13:35:10.092108 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" Dec 16 13:35:11 crc kubenswrapper[4757]: I1216 13:35:11.114297 4757 generic.go:334] "Generic (PLEG): container finished" podID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerID="644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0" exitCode=0 Dec 16 13:35:11 crc kubenswrapper[4757]: I1216 13:35:11.114441 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerDied","Data":"644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0"} Dec 16 13:35:11 
crc kubenswrapper[4757]: I1216 13:35:11.517388 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 13:35:12 crc kubenswrapper[4757]: I1216 13:35:12.135774 4757 generic.go:334] "Generic (PLEG): container finished" podID="daba874e-e472-48a3-b4f5-dce301d1438c" containerID="336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b" exitCode=0 Dec 16 13:35:12 crc kubenswrapper[4757]: I1216 13:35:12.135986 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x6g8" event={"ID":"daba874e-e472-48a3-b4f5-dce301d1438c","Type":"ContainerDied","Data":"336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b"} Dec 16 13:35:12 crc kubenswrapper[4757]: I1216 13:35:12.145032 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerStarted","Data":"50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e"} Dec 16 13:35:12 crc kubenswrapper[4757]: I1216 13:35:12.193485 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gpnfn" podStartSLOduration=99.170506075 podStartE2EDuration="1m42.193456278s" podCreationTimestamp="2025-12-16 13:33:30 +0000 UTC" firstStartedPulling="2025-12-16 13:35:08.868796632 +0000 UTC m=+2894.296540428" lastFinishedPulling="2025-12-16 13:35:11.891746835 +0000 UTC m=+2897.319490631" observedRunningTime="2025-12-16 13:35:12.187524991 +0000 UTC m=+2897.615268787" watchObservedRunningTime="2025-12-16 13:35:12.193456278 +0000 UTC m=+2897.621200084" Dec 16 13:35:13 crc kubenswrapper[4757]: I1216 13:35:13.154751 4757 generic.go:334] "Generic (PLEG): container finished" podID="65a6f73f-1823-407a-a9f2-c693e5ddcca9" containerID="dec1770f5c192f62e118934bead7638fcc2c0688f9c70c02c9368ca807f35240" exitCode=0 Dec 16 13:35:13 crc kubenswrapper[4757]: I1216 13:35:13.154817 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerDied","Data":"dec1770f5c192f62e118934bead7638fcc2c0688f9c70c02c9368ca807f35240"} Dec 16 13:35:13 crc kubenswrapper[4757]: I1216 13:35:13.159047 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x6g8" event={"ID":"daba874e-e472-48a3-b4f5-dce301d1438c","Type":"ContainerStarted","Data":"3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e"} Dec 16 13:35:13 crc kubenswrapper[4757]: I1216 13:35:13.200371 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6x6g8" podStartSLOduration=7.354235548 podStartE2EDuration="10.200319796s" podCreationTimestamp="2025-12-16 13:35:03 +0000 UTC" firstStartedPulling="2025-12-16 13:35:10.066336825 +0000 UTC m=+2895.494080621" lastFinishedPulling="2025-12-16 13:35:12.912421073 +0000 UTC m=+2898.340164869" observedRunningTime="2025-12-16 13:35:13.188689709 +0000 UTC m=+2898.616433505" watchObservedRunningTime="2025-12-16 13:35:13.200319796 +0000 UTC m=+2898.628063592" Dec 16 13:35:13 crc kubenswrapper[4757]: I1216 13:35:13.365422 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9" Dec 16 13:35:14 crc kubenswrapper[4757]: I1216 13:35:14.171523 4757 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65a6f73f-1823-407a-a9f2-c693e5ddcca9","Type":"ContainerStarted","Data":"a8cff7b570031584406a5d35af78344497eabdac103347bd3c497c6937d39a3d"} Dec 16 13:35:16 crc kubenswrapper[4757]: I1216 13:35:16.249136 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-t9vm8" Dec 16 13:35:16 crc kubenswrapper[4757]: I1216 13:35:16.291230 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 13:35:16 crc kubenswrapper[4757]: I1216 13:35:16.296890 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 13:35:16 crc kubenswrapper[4757]: I1216 13:35:16.512659 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-4z9jz" Dec 16 13:35:16 crc kubenswrapper[4757]: I1216 13:35:16.627552 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-dxgr9" Dec 16 13:35:16 crc kubenswrapper[4757]: I1216 13:35:16.891077 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-45bgz" Dec 16 13:35:17 crc kubenswrapper[4757]: I1216 13:35:17.868316 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-w6kx8" Dec 16 13:35:18 crc kubenswrapper[4757]: E1216 13:35:18.518361 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f15431_d8cd_408d_8169_e06457cabccc.slice/crio-conmon-873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:35:18 crc kubenswrapper[4757]: I1216 13:35:18.679113 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:18 crc kubenswrapper[4757]: I1216 13:35:18.679235 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:18 crc kubenswrapper[4757]: I1216 13:35:18.770065 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:19 crc kubenswrapper[4757]: I1216 13:35:19.263136 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:19 crc kubenswrapper[4757]: I1216 13:35:19.323707 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6x6g8"] Dec 16 13:35:20 crc kubenswrapper[4757]: I1216 13:35:20.949154 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:35:20 crc kubenswrapper[4757]: E1216 13:35:20.949870 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:35:20 crc kubenswrapper[4757]: I1216 13:35:20.974945 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:35:20 crc kubenswrapper[4757]: I1216 13:35:20.974985 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.028996 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.231250 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6x6g8" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="registry-server" containerID="cri-o://3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e" gracePeriod=2 Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.286437 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.521575 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.696534 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.838201 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-utilities\") pod \"daba874e-e472-48a3-b4f5-dce301d1438c\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.838314 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg5x8\" (UniqueName: \"kubernetes.io/projected/daba874e-e472-48a3-b4f5-dce301d1438c-kube-api-access-sg5x8\") pod \"daba874e-e472-48a3-b4f5-dce301d1438c\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.838432 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-catalog-content\") pod \"daba874e-e472-48a3-b4f5-dce301d1438c\" (UID: \"daba874e-e472-48a3-b4f5-dce301d1438c\") " Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.839044 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-utilities" (OuterVolumeSpecName: "utilities") pod "daba874e-e472-48a3-b4f5-dce301d1438c" (UID: "daba874e-e472-48a3-b4f5-dce301d1438c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.859391 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daba874e-e472-48a3-b4f5-dce301d1438c-kube-api-access-sg5x8" (OuterVolumeSpecName: "kube-api-access-sg5x8") pod "daba874e-e472-48a3-b4f5-dce301d1438c" (UID: "daba874e-e472-48a3-b4f5-dce301d1438c"). InnerVolumeSpecName "kube-api-access-sg5x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.936350 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "daba874e-e472-48a3-b4f5-dce301d1438c" (UID: "daba874e-e472-48a3-b4f5-dce301d1438c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.941607 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.941808 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daba874e-e472-48a3-b4f5-dce301d1438c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:35:21 crc kubenswrapper[4757]: I1216 13:35:21.941887 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg5x8\" (UniqueName: \"kubernetes.io/projected/daba874e-e472-48a3-b4f5-dce301d1438c-kube-api-access-sg5x8\") on node \"crc\" DevicePath \"\"" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.244961 4757 generic.go:334] "Generic (PLEG): container finished" podID="daba874e-e472-48a3-b4f5-dce301d1438c" containerID="3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e" exitCode=0 Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.245040 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6x6g8" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.245040 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x6g8" event={"ID":"daba874e-e472-48a3-b4f5-dce301d1438c","Type":"ContainerDied","Data":"3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e"} Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.245468 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x6g8" event={"ID":"daba874e-e472-48a3-b4f5-dce301d1438c","Type":"ContainerDied","Data":"2fd9a7ea9e42bf721455fa4a9d5b3e63b561e600656dc222b0512d2472945ab8"} Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.245492 4757 scope.go:117] "RemoveContainer" containerID="3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.269636 4757 scope.go:117] "RemoveContainer" containerID="336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.298763 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6x6g8"] Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.307837 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6x6g8"] Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.312483 4757 scope.go:117] "RemoveContainer" containerID="934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.356790 4757 scope.go:117] "RemoveContainer" containerID="3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e" Dec 16 13:35:22 crc kubenswrapper[4757]: E1216 13:35:22.357329 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e\": container with ID starting with 3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e not found: ID does not exist" containerID="3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.357382 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e"} err="failed to get container status \"3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e\": rpc error: code = NotFound desc = could not find container \"3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e\": container with ID starting with 3bef45e6eafe7559d9b4dc5255b0cdf800b006c8a6d7711cac40676f7e504a1e not found: ID does not exist" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.357446 4757 scope.go:117] "RemoveContainer" containerID="336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b" Dec 16 13:35:22 crc kubenswrapper[4757]: E1216 13:35:22.357723 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b\": container with ID starting with 336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b not found: ID does not exist" containerID="336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.357756 4757 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b"} err="failed to get container status \"336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b\": rpc error: code = NotFound desc = could not find container \"336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b\": container with ID starting with 336a5bf765a51a2403985cb0b1104704464640bf82a62810fcb5bb6b67d0833b not found: ID does not exist" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.357773 4757 scope.go:117] "RemoveContainer" containerID="934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1" Dec 16 13:35:22 crc kubenswrapper[4757]: E1216 13:35:22.358143 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1\": container with ID starting with 934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1 not found: ID does not exist" containerID="934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.358174 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1"} err="failed to get container status \"934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1\": rpc error: code = NotFound desc = could not find container \"934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1\": container with ID starting with 934ba30338139d6851dbe807f98201e98c861c15b68ec0e36bb305bd97ecfde1 not found: ID does not exist" Dec 16 13:35:22 crc kubenswrapper[4757]: I1216 13:35:22.961196 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" path="/var/lib/kubelet/pods/daba874e-e472-48a3-b4f5-dce301d1438c/volumes" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.015168 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpnfn"] Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.255373 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gpnfn" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="registry-server" containerID="cri-o://50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e" gracePeriod=2 Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.662870 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.774807 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-catalog-content\") pod \"15380341-a6d5-4aa4-8447-05b61eaa936b\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.774957 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6h9\" (UniqueName: \"kubernetes.io/projected/15380341-a6d5-4aa4-8447-05b61eaa936b-kube-api-access-rm6h9\") pod \"15380341-a6d5-4aa4-8447-05b61eaa936b\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.775064 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-utilities\") pod \"15380341-a6d5-4aa4-8447-05b61eaa936b\" (UID: \"15380341-a6d5-4aa4-8447-05b61eaa936b\") " Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.776176 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-utilities" (OuterVolumeSpecName: "utilities") pod "15380341-a6d5-4aa4-8447-05b61eaa936b" (UID: "15380341-a6d5-4aa4-8447-05b61eaa936b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.790208 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15380341-a6d5-4aa4-8447-05b61eaa936b-kube-api-access-rm6h9" (OuterVolumeSpecName: "kube-api-access-rm6h9") pod "15380341-a6d5-4aa4-8447-05b61eaa936b" (UID: "15380341-a6d5-4aa4-8447-05b61eaa936b"). InnerVolumeSpecName "kube-api-access-rm6h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.791457 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15380341-a6d5-4aa4-8447-05b61eaa936b" (UID: "15380341-a6d5-4aa4-8447-05b61eaa936b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.877370 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.877403 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6h9\" (UniqueName: \"kubernetes.io/projected/15380341-a6d5-4aa4-8447-05b61eaa936b-kube-api-access-rm6h9\") on node \"crc\" DevicePath \"\"" Dec 16 13:35:23 crc kubenswrapper[4757]: I1216 13:35:23.877416 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15380341-a6d5-4aa4-8447-05b61eaa936b-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.274142 4757 generic.go:334] "Generic (PLEG): container finished" podID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerID="50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e" exitCode=0 Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.274189 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerDied","Data":"50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e"} Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.274214 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gpnfn" event={"ID":"15380341-a6d5-4aa4-8447-05b61eaa936b","Type":"ContainerDied","Data":"47132003a947ed661ffe43be48de291d893ed3ef9f1d09ae62cfbe5e5cc2f578"} Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.274234 4757 scope.go:117] "RemoveContainer" containerID="50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.274834 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gpnfn" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.311381 4757 scope.go:117] "RemoveContainer" containerID="644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.340386 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpnfn"] Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.350041 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gpnfn"] Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.359274 4757 scope.go:117] "RemoveContainer" containerID="fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.398432 4757 scope.go:117] "RemoveContainer" containerID="50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e" Dec 16 13:35:24 crc kubenswrapper[4757]: E1216 13:35:24.399095 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e\": container with ID starting with 50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e not found: ID does not exist" containerID="50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.399194 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e"} err="failed to get container status \"50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e\": rpc error: code = NotFound desc = could not find container \"50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e\": container with ID starting with 50f2db9ead6cc5a452b7b205433cadba21e7f4993bf41d7f4f9ae269db96c97e not found: ID does not exist" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.399273 4757 scope.go:117] "RemoveContainer" containerID="644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0" Dec 16 13:35:24 crc kubenswrapper[4757]: E1216 13:35:24.399779 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0\": container with ID starting with 644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0 not found: ID does not exist" containerID="644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.399848 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0"} err="failed to get container status \"644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0\": rpc error: code = NotFound desc = could not find container \"644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0\": container with ID starting with 644ef831a9715368e33374662c7a1b3fe725782f0818a560447f060b8db901e0 not found: ID does not exist" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.399893 4757 scope.go:117] "RemoveContainer" containerID="fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63" Dec 16 13:35:24 crc kubenswrapper[4757]: E1216 13:35:24.400434 4757 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63\": container with ID starting with fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63 not found: ID does not exist" containerID="fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.400519 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63"} err="failed to get container status \"fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63\": rpc error: code = NotFound desc = could not find container \"fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63\": container with ID starting with fcaa2a8813e3c37e8dcfd08cd0c5a38e041c8a5364cd8be9b675d52dc1baec63 not found: ID does not exist" Dec 16 13:35:24 crc kubenswrapper[4757]: I1216 13:35:24.959694 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" path="/var/lib/kubelet/pods/15380341-a6d5-4aa4-8447-05b61eaa936b/volumes" Dec 16 13:35:28 crc kubenswrapper[4757]: E1216 13:35:28.772663 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f15431_d8cd_408d_8169_e06457cabccc.slice/crio-conmon-873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:35:34 crc kubenswrapper[4757]: I1216 13:35:34.991483 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:35:34 crc kubenswrapper[4757]: E1216 13:35:34.993068 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:35:39 crc kubenswrapper[4757]: E1216 13:35:39.037752 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f15431_d8cd_408d_8169_e06457cabccc.slice/crio-conmon-873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:35:44 crc kubenswrapper[4757]: I1216 13:35:44.126961 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84c894d85c-x59lp" Dec 16 13:35:49 crc kubenswrapper[4757]: E1216 13:35:49.307676 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f15431_d8cd_408d_8169_e06457cabccc.slice/crio-conmon-873f6a8ede2fa9237135bad178b8e1733170cab1fde491de8c0d92c1f749eb42.scope\": RecentStats: unable to find data in memory cache]" Dec 16 13:35:49 crc kubenswrapper[4757]: I1216 13:35:49.949468 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:35:49 crc kubenswrapper[4757]: 
E1216 13:35:49.950215 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:36:03 crc kubenswrapper[4757]: I1216 13:36:03.949557 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:36:03 crc kubenswrapper[4757]: E1216 13:36:03.950340 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:36:17 crc kubenswrapper[4757]: I1216 13:36:17.949446 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:36:17 crc kubenswrapper[4757]: E1216 13:36:17.950364 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:36:28 crc kubenswrapper[4757]: I1216 13:36:28.950318 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:36:28 crc kubenswrapper[4757]: E1216 13:36:28.953392 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:36:41 crc kubenswrapper[4757]: I1216 13:36:41.949931 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:36:41 crc kubenswrapper[4757]: E1216 13:36:41.950653 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:36:56 crc kubenswrapper[4757]: I1216 13:36:56.949606 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:36:56 crc kubenswrapper[4757]: E1216 13:36:56.950240 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:37:10 crc kubenswrapper[4757]: I1216 13:37:10.951347 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:37:10 crc kubenswrapper[4757]: E1216 13:37:10.952265 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:37:24 crc kubenswrapper[4757]: I1216 13:37:24.956191 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:37:24 crc kubenswrapper[4757]: E1216 13:37:24.957032 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:37:38 crc kubenswrapper[4757]: I1216 13:37:38.950799 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:37:38 crc kubenswrapper[4757]: E1216 13:37:38.951792 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:37:49 crc kubenswrapper[4757]: I1216 13:37:49.949339 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:37:49 crc kubenswrapper[4757]: E1216 13:37:49.950231 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:38:02 crc kubenswrapper[4757]: I1216 13:38:02.949944 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:38:02 crc kubenswrapper[4757]: E1216 13:38:02.950895 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:38:04 crc kubenswrapper[4757]: I1216 13:38:04.845912 4757 generic.go:334] "Generic (PLEG): container finished" podID="13b0d4c7-5eab-400a-9513-9391342fffee" containerID="a89ecd7bc1cd7939fe167012e4296c920ab9110c143d3e6db32f782e7a07ae88" exitCode=0 Dec 16 13:38:04 crc kubenswrapper[4757]: I1216 13:38:04.845973 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" event={"ID":"13b0d4c7-5eab-400a-9513-9391342fffee","Type":"ContainerDied","Data":"a89ecd7bc1cd7939fe167012e4296c920ab9110c143d3e6db32f782e7a07ae88"} Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.290847 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.479383 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-ssh-key\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480371 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/13b0d4c7-5eab-400a-9513-9391342fffee-nova-extra-config-0\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480423 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-0\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480449 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv2m5\" (UniqueName: \"kubernetes.io/projected/13b0d4c7-5eab-400a-9513-9391342fffee-kube-api-access-bv2m5\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480490 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-1\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480565 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-1\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480592 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-0\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") " Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480611 4757 
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.480651 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-combined-ca-bundle\") pod \"13b0d4c7-5eab-400a-9513-9391342fffee\" (UID: \"13b0d4c7-5eab-400a-9513-9391342fffee\") "
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.485419 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.487145 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b0d4c7-5eab-400a-9513-9391342fffee-kube-api-access-bv2m5" (OuterVolumeSpecName: "kube-api-access-bv2m5") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "kube-api-access-bv2m5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.507132 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-inventory" (OuterVolumeSpecName: "inventory") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.513962 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.515109 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.515688 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.526642 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.528938 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.536764 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b0d4c7-5eab-400a-9513-9391342fffee-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "13b0d4c7-5eab-400a-9513-9391342fffee" (UID: "13b0d4c7-5eab-400a-9513-9391342fffee"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583206 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583247 4757 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/13b0d4c7-5eab-400a-9513-9391342fffee-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583264 4757 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583275 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv2m5\" (UniqueName: \"kubernetes.io/projected/13b0d4c7-5eab-400a-9513-9391342fffee-kube-api-access-bv2m5\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583287 4757 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583297 4757 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583307 4757 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583317 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-inventory\") on node \"crc\" DevicePath \"\""
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.583327 4757 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13b0d4c7-5eab-400a-9513-9391342fffee-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
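The three-step pattern above ("UnmountVolume started", "UnmountVolume.TearDown succeeded", "Volume detached") is the kubelet volume manager's reconciler draining the mounts of the finished nova-edpm-deployment pod: it compares the desired state of the world (volumes some pod still needs) against the actual state (volumes still mounted) and tears down the difference. A toy model of that diff, using plain string sets instead of kubelet's real DesiredStateOfWorld/ActualStateOfWorld types (the names and shapes here are illustrative assumptions, not kubelet's code):

package main

import (
	"fmt"
	"sort"
)

// staleVolumes returns mounted volumes that no pod desires any more --
// the set difference the reconciler walks before starting each unmount.
func staleVolumes(desired, actual map[string]bool) []string {
	var stale []string
	for vol := range actual {
		if !desired[vol] {
			stale = append(stale, vol)
		}
	}
	sort.Strings(stale) // deterministic output for the demo
	return stale
}

func main() {
	actual := map[string]bool{
		"ssh-key": true, "inventory": true,
		"nova-combined-ca-bundle": true, "kube-api-access-bv2m5": true,
	}
	desired := map[string]bool{} // pod deleted: nothing is desired
	for _, v := range staleVolumes(desired, actual) {
		fmt.Println("unmount, then report detached:", v)
	}
}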
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.865859 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd" event={"ID":"13b0d4c7-5eab-400a-9513-9391342fffee","Type":"ContainerDied","Data":"20efffac4799d0a2f944f95a039ff006ce8bbf18839798cdb7055533e9ed6a8b"}
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.866203 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20efffac4799d0a2f944f95a039ff006ce8bbf18839798cdb7055533e9ed6a8b"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.866208 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-8dcjd"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.991658 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"]
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992129 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="registry-server"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992150 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="registry-server"
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992166 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="extract-utilities"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992173 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="extract-utilities"
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992190 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="extract-content"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992195 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="extract-content"
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992213 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b0d4c7-5eab-400a-9513-9391342fffee" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992219 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b0d4c7-5eab-400a-9513-9391342fffee" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992228 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="registry-server"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992235 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="registry-server"
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992249 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="extract-utilities"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992256 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="extract-utilities"
Dec 16 13:38:06 crc kubenswrapper[4757]: E1216 13:38:06.992267 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="extract-content"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992274 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="extract-content"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992489 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b0d4c7-5eab-400a-9513-9391342fffee" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992518 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="daba874e-e472-48a3-b4f5-dce301d1438c" containerName="registry-server"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.992536 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="15380341-a6d5-4aa4-8447-05b61eaa936b" containerName="registry-server"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.993260 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:06 crc kubenswrapper[4757]: I1216 13:38:06.999715 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:06.999860 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.000257 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.000925 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.001180 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.007880 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"]
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.092504 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.092611 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.092672 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.092798 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.094161 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ldw\" (UniqueName: \"kubernetes.io/projected/eb89db22-f667-4563-9468-97cd48c1da89-kube-api-access-w4ldw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.094363 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.094595 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.196778 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.196880 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ldw\" (UniqueName: \"kubernetes.io/projected/eb89db22-f667-4563-9468-97cd48c1da89-kube-api-access-w4ldw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.196932 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.196966 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.197061 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.197111 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.197140 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.202160 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.202759 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.203070 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.203586 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.204155 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.209546 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.220602 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ldw\" (UniqueName: \"kubernetes.io/projected/eb89db22-f667-4563-9468-97cd48c1da89-kube-api-access-w4ldw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.315821 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-58lpf"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.324070 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.966318 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"]
Dec 16 13:38:07 crc kubenswrapper[4757]: W1216 13:38:07.970750 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb89db22_f667_4563_9468_97cd48c1da89.slice/crio-0d6b9496eb1e9249bab3cda9aaa4c5f4094ad04ff2bf8c1c3b40ec97d55de54d WatchSource:0}: Error finding container 0d6b9496eb1e9249bab3cda9aaa4c5f4094ad04ff2bf8c1c3b40ec97d55de54d: Status 404 returned error can't find the container with id 0d6b9496eb1e9249bab3cda9aaa4c5f4094ad04ff2bf8c1c3b40ec97d55de54d
Dec 16 13:38:07 crc kubenswrapper[4757]: I1216 13:38:07.973962 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 13:38:08 crc kubenswrapper[4757]: I1216 13:38:08.723232 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 16 13:38:08 crc kubenswrapper[4757]: I1216 13:38:08.888624 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn" event={"ID":"eb89db22-f667-4563-9468-97cd48c1da89","Type":"ContainerStarted","Data":"0d6b9496eb1e9249bab3cda9aaa4c5f4094ad04ff2bf8c1c3b40ec97d55de54d"}
Dec 16 13:38:09 crc kubenswrapper[4757]: I1216 13:38:09.897742 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn" event={"ID":"eb89db22-f667-4563-9468-97cd48c1da89","Type":"ContainerStarted","Data":"c20c6f5de21faf812ddef3f1f4d8d8ab3ade9c33bd45a49802c501640b064517"}
Dec 16 13:38:09 crc kubenswrapper[4757]: I1216 13:38:09.928621 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn" podStartSLOduration=3.182668698 podStartE2EDuration="3.928597179s" podCreationTimestamp="2025-12-16 13:38:06 +0000 UTC" firstStartedPulling="2025-12-16 13:38:07.973782053 +0000 UTC m=+3073.401525849" lastFinishedPulling="2025-12-16 13:38:08.719710524 +0000 UTC m=+3074.147454330" observedRunningTime="2025-12-16 13:38:09.916702073 +0000 UTC m=+3075.344445879" watchObservedRunningTime="2025-12-16 13:38:09.928597179 +0000 UTC m=+3075.356340985"
Dec 16 13:38:16 crc kubenswrapper[4757]: I1216 13:38:16.949804 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:38:16 crc kubenswrapper[4757]: E1216 13:38:16.951088 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:38:28 crc kubenswrapper[4757]: I1216 13:38:28.949617 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:38:28 crc kubenswrapper[4757]: E1216 13:38:28.950479 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:38:42 crc kubenswrapper[4757]: I1216 13:38:42.948726 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:38:42 crc kubenswrapper[4757]: E1216 13:38:42.949641 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:38:54 crc kubenswrapper[4757]: I1216 13:38:54.957543 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:38:54 crc kubenswrapper[4757]: E1216 13:38:54.958629 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:39:08 crc kubenswrapper[4757]: I1216 13:39:08.949917 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:39:08 crc kubenswrapper[4757]: E1216 13:39:08.952268 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
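The "Observed pod startup duration" entry for telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn above carries its own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (firstStartedPulling to lastFinishedPulling). Re-running the numbers copied from the entry reproduces the logged values to within tens of nanoseconds (the tracker subtracts monotonic m= readings, hence the tiny drift); the parse layout below matches Go's default time.Time formatting:

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout for timestamps like "2025-12-16 13:38:06 +0000 UTC".
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-16 13:38:06 +0000 UTC")
	firstPull := mustParse("2025-12-16 13:38:07.973782053 +0000 UTC")
	lastPull := mustParse("2025-12-16 13:38:08.719710524 +0000 UTC")
	watched := mustParse("2025-12-16 13:38:09.928597179 +0000 UTC")

	e2e := watched.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // exclude time spent pulling images
	fmt.Println(e2e)                     // 3.928597179s, as logged
	fmt.Println(slo)                     // ~3.182668708s vs logged 3.182668698
}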
Dec 16 13:39:23 crc kubenswrapper[4757]: I1216 13:39:23.949301 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:39:23 crc kubenswrapper[4757]: E1216 13:39:23.950246 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:39:35 crc kubenswrapper[4757]: I1216 13:39:35.948589 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:39:35 crc kubenswrapper[4757]: E1216 13:39:35.949529 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:39:50 crc kubenswrapper[4757]: I1216 13:39:50.949134 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:39:50 crc kubenswrapper[4757]: E1216 13:39:50.949923 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:40:02 crc kubenswrapper[4757]: I1216 13:40:02.949535 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:40:02 crc kubenswrapper[4757]: E1216 13:40:02.950566 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:40:13 crc kubenswrapper[4757]: I1216 13:40:13.949473 4757 scope.go:117] "RemoveContainer" containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec"
Dec 16 13:40:15 crc kubenswrapper[4757]: I1216 13:40:15.049037 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"6ba36603b2ede44048e78f416348abfebbffb6f1388fd687a5fe481e6d72c629"}
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.694412 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8pgpm"]
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.697350 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.706430 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pgpm"]
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.838357 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-catalog-content\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.838485 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762zx\" (UniqueName: \"kubernetes.io/projected/c1de2ba0-68bd-4f59-8088-04213dcedb4c-kube-api-access-762zx\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.838737 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-utilities\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.940309 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762zx\" (UniqueName: \"kubernetes.io/projected/c1de2ba0-68bd-4f59-8088-04213dcedb4c-kube-api-access-762zx\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.940534 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-utilities\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.940577 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-catalog-content\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.941287 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-utilities\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.941303 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-catalog-content\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:05 crc kubenswrapper[4757]: I1216 13:41:05.966480 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762zx\" (UniqueName: \"kubernetes.io/projected/c1de2ba0-68bd-4f59-8088-04213dcedb4c-kube-api-access-762zx\") pod \"certified-operators-8pgpm\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") " pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:06 crc kubenswrapper[4757]: I1216 13:41:06.015832 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:06 crc kubenswrapper[4757]: W1216 13:41:06.634253 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1de2ba0_68bd_4f59_8088_04213dcedb4c.slice/crio-abdd3bdeee477ee23114bb6cc31f2298b2c2a993fb2f5ef30d14fe18af22691a WatchSource:0}: Error finding container abdd3bdeee477ee23114bb6cc31f2298b2c2a993fb2f5ef30d14fe18af22691a: Status 404 returned error can't find the container with id abdd3bdeee477ee23114bb6cc31f2298b2c2a993fb2f5ef30d14fe18af22691a
Dec 16 13:41:06 crc kubenswrapper[4757]: I1216 13:41:06.636565 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pgpm"]
Dec 16 13:41:07 crc kubenswrapper[4757]: I1216 13:41:07.491318 4757 generic.go:334] "Generic (PLEG): container finished" podID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerID="21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1" exitCode=0
Dec 16 13:41:07 crc kubenswrapper[4757]: I1216 13:41:07.491463 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerDied","Data":"21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1"}
Dec 16 13:41:07 crc kubenswrapper[4757]: I1216 13:41:07.491678 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerStarted","Data":"abdd3bdeee477ee23114bb6cc31f2298b2c2a993fb2f5ef30d14fe18af22691a"}
Dec 16 13:41:08 crc kubenswrapper[4757]: I1216 13:41:08.502096 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerStarted","Data":"9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1"}
Dec 16 13:41:10 crc kubenswrapper[4757]: I1216 13:41:10.520814 4757 generic.go:334] "Generic (PLEG): container finished" podID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerID="9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1" exitCode=0
Dec 16 13:41:10 crc kubenswrapper[4757]: I1216 13:41:10.520906 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerDied","Data":"9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1"}
Dec 16 13:41:11 crc kubenswrapper[4757]: I1216 13:41:11.533460 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerStarted","Data":"f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b"}
Dec 16 13:41:11 crc kubenswrapper[4757]: I1216 13:41:11.560037 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8pgpm" podStartSLOduration=2.963141325 podStartE2EDuration="6.560017297s" podCreationTimestamp="2025-12-16 13:41:05 +0000 UTC" firstStartedPulling="2025-12-16 13:41:07.493039323 +0000 UTC m=+3252.920783129" lastFinishedPulling="2025-12-16 13:41:11.089915305 +0000 UTC m=+3256.517659101" observedRunningTime="2025-12-16 13:41:11.550423798 +0000 UTC m=+3256.978167614" watchObservedRunningTime="2025-12-16 13:41:11.560017297 +0000 UTC m=+3256.987761093"
Dec 16 13:41:16 crc kubenswrapper[4757]: I1216 13:41:16.015886 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:16 crc kubenswrapper[4757]: I1216 13:41:16.016515 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:16 crc kubenswrapper[4757]: I1216 13:41:16.069406 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:16 crc kubenswrapper[4757]: I1216 13:41:16.657043 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:18 crc kubenswrapper[4757]: I1216 13:41:18.417949 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pgpm"]
Dec 16 13:41:18 crc kubenswrapper[4757]: I1216 13:41:18.610573 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8pgpm" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="registry-server" containerID="cri-o://f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b" gracePeriod=2
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.074945 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.196982 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-catalog-content\") pod \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") "
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.197168 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-utilities\") pod \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") "
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.197250 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-762zx\" (UniqueName: \"kubernetes.io/projected/c1de2ba0-68bd-4f59-8088-04213dcedb4c-kube-api-access-762zx\") pod \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\" (UID: \"c1de2ba0-68bd-4f59-8088-04213dcedb4c\") "
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.198412 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-utilities" (OuterVolumeSpecName: "utilities") pod "c1de2ba0-68bd-4f59-8088-04213dcedb4c" (UID: "c1de2ba0-68bd-4f59-8088-04213dcedb4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.207445 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1de2ba0-68bd-4f59-8088-04213dcedb4c-kube-api-access-762zx" (OuterVolumeSpecName: "kube-api-access-762zx") pod "c1de2ba0-68bd-4f59-8088-04213dcedb4c" (UID: "c1de2ba0-68bd-4f59-8088-04213dcedb4c"). InnerVolumeSpecName "kube-api-access-762zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.261809 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1de2ba0-68bd-4f59-8088-04213dcedb4c" (UID: "c1de2ba0-68bd-4f59-8088-04213dcedb4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.299155 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-762zx\" (UniqueName: \"kubernetes.io/projected/c1de2ba0-68bd-4f59-8088-04213dcedb4c-kube-api-access-762zx\") on node \"crc\" DevicePath \"\""
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.299191 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.299203 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1de2ba0-68bd-4f59-8088-04213dcedb4c-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.630168 4757 generic.go:334] "Generic (PLEG): container finished" podID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerID="f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b" exitCode=0
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.630227 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerDied","Data":"f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b"}
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.630254 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pgpm"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.630280 4757 scope.go:117] "RemoveContainer" containerID="f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.630262 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pgpm" event={"ID":"c1de2ba0-68bd-4f59-8088-04213dcedb4c","Type":"ContainerDied","Data":"abdd3bdeee477ee23114bb6cc31f2298b2c2a993fb2f5ef30d14fe18af22691a"}
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.652718 4757 scope.go:117] "RemoveContainer" containerID="9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.681423 4757 scope.go:117] "RemoveContainer" containerID="21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.682434 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pgpm"]
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.691020 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8pgpm"]
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.738831 4757 scope.go:117] "RemoveContainer" containerID="f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b"
Dec 16 13:41:19 crc kubenswrapper[4757]: E1216 13:41:19.739307 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b\": container with ID starting with f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b not found: ID does not exist" containerID="f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.739342 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b"} err="failed to get container status \"f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b\": rpc error: code = NotFound desc = could not find container \"f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b\": container with ID starting with f20bbc68bf7537764b9b451f2a064ff4809d224ffea970b0a7aaf89526b4831b not found: ID does not exist"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.739371 4757 scope.go:117] "RemoveContainer" containerID="9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1"
Dec 16 13:41:19 crc kubenswrapper[4757]: E1216 13:41:19.739811 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1\": container with ID starting with 9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1 not found: ID does not exist" containerID="9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.739841 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1"} err="failed to get container status \"9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1\": rpc error: code = NotFound desc = could not find container \"9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1\": container with ID starting with 9dad7d68879e2a2cc3d57ae2f4a13a771693fdaf2956a01c011780ec3a0719b1 not found: ID does not exist"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.739859 4757 scope.go:117] "RemoveContainer" containerID="21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1"
Dec 16 13:41:19 crc kubenswrapper[4757]: E1216 13:41:19.740240 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1\": container with ID starting with 21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1 not found: ID does not exist" containerID="21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1"
Dec 16 13:41:19 crc kubenswrapper[4757]: I1216 13:41:19.740265 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1"} err="failed to get container status \"21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1\": rpc error: code = NotFound desc = could not find container \"21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1\": container with ID starting with 21c04ff4107d11c314281a2bfce466e94b34cc1a68b366e31c48e8348dd760c1 not found: ID does not exist"
Dec 16 13:41:20 crc kubenswrapper[4757]: I1216 13:41:20.960092 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" path="/var/lib/kubelet/pods/c1de2ba0-68bd-4f59-8088-04213dcedb4c/volumes"
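The "DeleteContainer returned error" entries above are harmless churn rather than failures: the registry-server containers were already removed, so the follow-up ContainerStatus call gets gRPC NotFound from CRI-O, and the kubelet just logs it and moves on. Cleanup code has to treat NotFound as success to stay idempotent; a sketch of that pattern (removeContainer is a hypothetical stand-in for the CRI call, while status.Code and codes.NotFound are the real grpc-go helpers):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer round-trip that
// races with a container which is already gone.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: ID does not exist", id)
}

// cleanup treats NotFound as "already deleted" so a retried deletion
// loop never gets stuck on a container that no longer exists.
func cleanup(id string) error {
	err := removeContainer(id)
	if status.Code(err) == codes.NotFound {
		return nil // desired state already reached
	}
	return err
}

func main() {
	fmt.Println(cleanup("f20bbc68bf75")) // prints <nil>
}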
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.571626 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8djct"]
Dec 16 13:42:04 crc kubenswrapper[4757]: E1216 13:42:04.577217 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="extract-content"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.577345 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="extract-content"
Dec 16 13:42:04 crc kubenswrapper[4757]: E1216 13:42:04.577468 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="registry-server"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.577554 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="registry-server"
Dec 16 13:42:04 crc kubenswrapper[4757]: E1216 13:42:04.577667 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="extract-utilities"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.577763 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="extract-utilities"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.578163 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1de2ba0-68bd-4f59-8088-04213dcedb4c" containerName="registry-server"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.580199 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.595529 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8djct"]
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.710878 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e45bdb9-422b-4864-b3fc-4aeab70108a3-catalog-content\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.710985 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e45bdb9-422b-4864-b3fc-4aeab70108a3-utilities\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.711053 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsttx\" (UniqueName: \"kubernetes.io/projected/4e45bdb9-422b-4864-b3fc-4aeab70108a3-kube-api-access-hsttx\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.812151 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e45bdb9-422b-4864-b3fc-4aeab70108a3-utilities\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.812218 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsttx\" (UniqueName: \"kubernetes.io/projected/4e45bdb9-422b-4864-b3fc-4aeab70108a3-kube-api-access-hsttx\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.812324 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e45bdb9-422b-4864-b3fc-4aeab70108a3-catalog-content\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.812735 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e45bdb9-422b-4864-b3fc-4aeab70108a3-catalog-content\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.812747 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e45bdb9-422b-4864-b3fc-4aeab70108a3-utilities\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.844553 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsttx\" (UniqueName: \"kubernetes.io/projected/4e45bdb9-422b-4864-b3fc-4aeab70108a3-kube-api-access-hsttx\") pod \"redhat-operators-8djct\" (UID: \"4e45bdb9-422b-4864-b3fc-4aeab70108a3\") " pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:04 crc kubenswrapper[4757]: I1216 13:42:04.908147 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8djct"
Dec 16 13:42:05 crc kubenswrapper[4757]: I1216 13:42:05.437295 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8djct"]
Dec 16 13:42:06 crc kubenswrapper[4757]: I1216 13:42:06.071359 4757 generic.go:334] "Generic (PLEG): container finished" podID="4e45bdb9-422b-4864-b3fc-4aeab70108a3" containerID="e0f02c80d28e63a4292cdce60b1794d00132b31d918d2147a0fc03c12bbda621" exitCode=0
Dec 16 13:42:06 crc kubenswrapper[4757]: I1216 13:42:06.071457 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8djct" event={"ID":"4e45bdb9-422b-4864-b3fc-4aeab70108a3","Type":"ContainerDied","Data":"e0f02c80d28e63a4292cdce60b1794d00132b31d918d2147a0fc03c12bbda621"}
Dec 16 13:42:06 crc kubenswrapper[4757]: I1216 13:42:06.071913 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8djct" event={"ID":"4e45bdb9-422b-4864-b3fc-4aeab70108a3","Type":"ContainerStarted","Data":"28e5762a4c169f3e265b96115af90f868d71f99487fd794ba374cb51251a3ecb"}
Dec 16 13:42:13 crc kubenswrapper[4757]: I1216 13:42:13.154661 4757 generic.go:334] "Generic (PLEG): container finished" podID="eb89db22-f667-4563-9468-97cd48c1da89" containerID="c20c6f5de21faf812ddef3f1f4d8d8ab3ade9c33bd45a49802c501640b064517" exitCode=0
Dec 16 13:42:13 crc kubenswrapper[4757]: I1216 13:42:13.154847 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn" event={"ID":"eb89db22-f667-4563-9468-97cd48c1da89","Type":"ContainerDied","Data":"c20c6f5de21faf812ddef3f1f4d8d8ab3ade9c33bd45a49802c501640b064517"}
Dec 16 13:42:16 crc kubenswrapper[4757]: I1216 13:42:16.690963 4757 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 13:42:16 crc kubenswrapper[4757]: I1216 13:42:16.691633 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
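The readiness failure above is a client-side timeout, not an HTTP error: the scheduler's /healthz did not return response headers before the prober's deadline, so net/http aborted the request. The same condition is easy to reproduce with a stock http.Client against a deliberately slow handler (the 1s/2s values below are arbitrary choices for the demo, not the prober's configuration):

package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// A /healthz that stalls longer than the client is willing to wait.
	srv := httptest.NewServer(http.HandlerFunc(
		func(w http.ResponseWriter, r *http.Request) {
			time.Sleep(2 * time.Second)
		}))
	defer srv.Close()

	client := &http.Client{Timeout: 1 * time.Second}
	_, err := client.Get(srv.URL + "/healthz")
	fmt.Println(err)
	// The error cites "Client.Timeout exceeded while awaiting headers",
	// the same condition the prober reported for kube-scheduler above.
}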
Dec 16 13:42:16 crc kubenswrapper[4757]: I1216 13:42:16.690963 4757 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 16 13:42:16 crc kubenswrapper[4757]: I1216 13:42:16.691633 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.540678 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn"
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.654772 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-telemetry-combined-ca-bundle\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.655078 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ssh-key\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.655120 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-0\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.655153 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-inventory\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.655279 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-1\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.655349 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4ldw\" (UniqueName: \"kubernetes.io/projected/eb89db22-f667-4563-9468-97cd48c1da89-kube-api-access-w4ldw\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.655383 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-2\") pod \"eb89db22-f667-4563-9468-97cd48c1da89\" (UID: \"eb89db22-f667-4563-9468-97cd48c1da89\") "
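The scheduler readiness failure above is a plain client-side timeout: the kubelet's HTTP prober gave up waiting for response headers. A Go client with a deadline reproduces the same error text against the endpoint from the log (TLS verification disabled here purely for illustration; newer Go versions word the error "context deadline exceeded" rather than "request canceled"):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout:   1 * time.Second, // prober-style deadline
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get("https://192.168.126.11:10259/healthz")
	if err != nil {
		fmt.Println("Probe failed:", err) // ... (Client.Timeout exceeded while awaiting headers)
		return
	}
	defer resp.Body.Close()
	fmt.Println("Probe status:", resp.Status)
}
```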
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.673285 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "eb89db22-f667-4563-9468-97cd48c1da89" (UID: "eb89db22-f667-4563-9468-97cd48c1da89"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.673386 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb89db22-f667-4563-9468-97cd48c1da89-kube-api-access-w4ldw" (OuterVolumeSpecName: "kube-api-access-w4ldw") pod "eb89db22-f667-4563-9468-97cd48c1da89" (UID: "eb89db22-f667-4563-9468-97cd48c1da89"). InnerVolumeSpecName "kube-api-access-w4ldw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.685728 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "eb89db22-f667-4563-9468-97cd48c1da89" (UID: "eb89db22-f667-4563-9468-97cd48c1da89"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.694266 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "eb89db22-f667-4563-9468-97cd48c1da89" (UID: "eb89db22-f667-4563-9468-97cd48c1da89"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.748665 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eb89db22-f667-4563-9468-97cd48c1da89" (UID: "eb89db22-f667-4563-9468-97cd48c1da89"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.758262 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.758502 4757 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.758631 4757 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.758719 4757 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.758802 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4ldw\" (UniqueName: \"kubernetes.io/projected/eb89db22-f667-4563-9468-97cd48c1da89-kube-api-access-w4ldw\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.759141 4757 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.770146 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "eb89db22-f667-4563-9468-97cd48c1da89" (UID: "eb89db22-f667-4563-9468-97cd48c1da89"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:42:19 crc kubenswrapper[4757]: I1216 13:42:19.862673 4757 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/eb89db22-f667-4563-9468-97cd48c1da89-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:20 crc kubenswrapper[4757]: I1216 13:42:20.214967 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8djct" event={"ID":"4e45bdb9-422b-4864-b3fc-4aeab70108a3","Type":"ContainerStarted","Data":"9c48d4968e86e4b5d1772bc14694512daadbd6bcb3deb38169716e485aab5d8b"} Dec 16 13:42:20 crc kubenswrapper[4757]: I1216 13:42:20.218019 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn" event={"ID":"eb89db22-f667-4563-9468-97cd48c1da89","Type":"ContainerDied","Data":"0d6b9496eb1e9249bab3cda9aaa4c5f4094ad04ff2bf8c1c3b40ec97d55de54d"} Dec 16 13:42:20 crc kubenswrapper[4757]: I1216 13:42:20.218060 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d6b9496eb1e9249bab3cda9aaa4c5f4094ad04ff2bf8c1c3b40ec97d55de54d" Dec 16 13:42:20 crc kubenswrapper[4757]: I1216 13:42:20.218122 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn" Dec 16 13:42:21 crc kubenswrapper[4757]: I1216 13:42:21.181251 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:42:21 crc kubenswrapper[4757]: I1216 13:42:21.181463 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:42:23 crc kubenswrapper[4757]: I1216 13:42:23.247093 4757 generic.go:334] "Generic (PLEG): container finished" podID="4e45bdb9-422b-4864-b3fc-4aeab70108a3" containerID="9c48d4968e86e4b5d1772bc14694512daadbd6bcb3deb38169716e485aab5d8b" exitCode=0 Dec 16 13:42:23 crc kubenswrapper[4757]: I1216 13:42:23.247284 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8djct" event={"ID":"4e45bdb9-422b-4864-b3fc-4aeab70108a3","Type":"ContainerDied","Data":"9c48d4968e86e4b5d1772bc14694512daadbd6bcb3deb38169716e485aab5d8b"} Dec 16 13:42:25 crc kubenswrapper[4757]: I1216 13:42:25.272169 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8djct" event={"ID":"4e45bdb9-422b-4864-b3fc-4aeab70108a3","Type":"ContainerStarted","Data":"4c80ae95645d37a0aefe22aefb4c9805dd2f53866d95c54651bfb5245789a6bd"} Dec 16 13:42:25 crc kubenswrapper[4757]: I1216 13:42:25.297455 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8djct" podStartSLOduration=2.996930408 podStartE2EDuration="21.297430043s" podCreationTimestamp="2025-12-16 13:42:04 +0000 UTC" firstStartedPulling="2025-12-16 13:42:06.07781569 +0000 UTC m=+3311.505559486" lastFinishedPulling="2025-12-16 13:42:24.378315325 +0000 UTC m=+3329.806059121" observedRunningTime="2025-12-16 13:42:25.290589344 +0000 UTC m=+3330.718333140" watchObservedRunningTime="2025-12-16 13:42:25.297430043 +0000 UTC m=+3330.725173839" Dec 16 13:42:34 crc kubenswrapper[4757]: I1216 13:42:34.908669 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8djct" Dec 16 13:42:34 crc kubenswrapper[4757]: I1216 13:42:34.909661 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8djct" Dec 16 13:42:34 crc kubenswrapper[4757]: I1216 13:42:34.977443 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8djct" Dec 16 13:42:35 crc kubenswrapper[4757]: I1216 13:42:35.402195 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8djct" Dec 16 13:42:35 crc kubenswrapper[4757]: I1216 13:42:35.487184 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8djct"] Dec 16 13:42:35 crc kubenswrapper[4757]: I1216 13:42:35.549997 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdz7w"] Dec 16 13:42:35 crc kubenswrapper[4757]: I1216 13:42:35.550325 4757 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cdz7w" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="registry-server" containerID="cri-o://a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e" gracePeriod=2 Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.187966 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.371416 4757 generic.go:334] "Generic (PLEG): container finished" podID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerID="a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e" exitCode=0 Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.371502 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz7w" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.371540 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerDied","Data":"a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e"} Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.372080 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz7w" event={"ID":"e7eba443-b255-4c4b-8aad-fc891c2a8a39","Type":"ContainerDied","Data":"7837af01ad225ffbad5445991bc829c67dc80fe5513fd218ee281784e0d431da"} Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.372108 4757 scope.go:117] "RemoveContainer" containerID="a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.382500 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-catalog-content\") pod \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.382567 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8lrs\" (UniqueName: \"kubernetes.io/projected/e7eba443-b255-4c4b-8aad-fc891c2a8a39-kube-api-access-d8lrs\") pod \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.382617 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-utilities\") pod \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\" (UID: \"e7eba443-b255-4c4b-8aad-fc891c2a8a39\") " Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.387957 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-utilities" (OuterVolumeSpecName: "utilities") pod "e7eba443-b255-4c4b-8aad-fc891c2a8a39" (UID: "e7eba443-b255-4c4b-8aad-fc891c2a8a39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.399203 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7eba443-b255-4c4b-8aad-fc891c2a8a39-kube-api-access-d8lrs" (OuterVolumeSpecName: "kube-api-access-d8lrs") pod "e7eba443-b255-4c4b-8aad-fc891c2a8a39" (UID: "e7eba443-b255-4c4b-8aad-fc891c2a8a39"). InnerVolumeSpecName "kube-api-access-d8lrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.412753 4757 scope.go:117] "RemoveContainer" containerID="81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.486823 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8lrs\" (UniqueName: \"kubernetes.io/projected/e7eba443-b255-4c4b-8aad-fc891c2a8a39-kube-api-access-d8lrs\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.488758 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.501202 4757 scope.go:117] "RemoveContainer" containerID="80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.551176 4757 scope.go:117] "RemoveContainer" containerID="a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e" Dec 16 13:42:36 crc kubenswrapper[4757]: E1216 13:42:36.554566 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e\": container with ID starting with a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e not found: ID does not exist" containerID="a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.554609 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e"} err="failed to get container status \"a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e\": rpc error: code = NotFound desc = could not find container \"a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e\": container with ID starting with a677d00c0ef788983f4fd96b5a392038dcb5e103ff7be25ce29fb57f06ec940e not found: ID does not exist" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.554634 4757 scope.go:117] "RemoveContainer" containerID="81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2" Dec 16 13:42:36 crc kubenswrapper[4757]: E1216 13:42:36.554960 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2\": container with ID starting with 81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2 not found: ID does not exist" containerID="81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.554978 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2"} err="failed to get 
container status \"81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2\": rpc error: code = NotFound desc = could not find container \"81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2\": container with ID starting with 81899b714702aa11776940648132997a2db202a16a8a076f2bf7302a4e40f1c2 not found: ID does not exist" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.554991 4757 scope.go:117] "RemoveContainer" containerID="80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4" Dec 16 13:42:36 crc kubenswrapper[4757]: E1216 13:42:36.555224 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4\": container with ID starting with 80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4 not found: ID does not exist" containerID="80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.555241 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4"} err="failed to get container status \"80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4\": rpc error: code = NotFound desc = could not find container \"80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4\": container with ID starting with 80f763398c0908391d7685f4eee18bf4ccc3b347c169906cb37b1744880917a4 not found: ID does not exist" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.615996 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7eba443-b255-4c4b-8aad-fc891c2a8a39" (UID: "e7eba443-b255-4c4b-8aad-fc891c2a8a39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.692473 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eba443-b255-4c4b-8aad-fc891c2a8a39-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.704757 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdz7w"] Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.722699 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cdz7w"] Dec 16 13:42:36 crc kubenswrapper[4757]: I1216 13:42:36.961154 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" path="/var/lib/kubelet/pods/e7eba443-b255-4c4b-8aad-fc891c2a8a39/volumes" Dec 16 13:42:47 crc kubenswrapper[4757]: E1216 13:42:47.815998 4757 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:39170->38.102.83.110:43565: write tcp 38.102.83.110:39170->38.102.83.110:43565: write: broken pipe Dec 16 13:42:51 crc kubenswrapper[4757]: I1216 13:42:51.181102 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:42:51 crc kubenswrapper[4757]: I1216 13:42:51.181516 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.455635 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 13:43:20 crc kubenswrapper[4757]: E1216 13:43:20.456766 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="registry-server" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.456789 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="registry-server" Dec 16 13:43:20 crc kubenswrapper[4757]: E1216 13:43:20.456814 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb89db22-f667-4563-9468-97cd48c1da89" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.456824 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb89db22-f667-4563-9468-97cd48c1da89" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 13:43:20 crc kubenswrapper[4757]: E1216 13:43:20.457031 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="extract-content" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.457044 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="extract-content" Dec 16 13:43:20 crc kubenswrapper[4757]: E1216 13:43:20.457081 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="extract-utilities" Dec 16 13:43:20 crc 
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.457091 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="extract-utilities"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.457350 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb89db22-f667-4563-9468-97cd48c1da89" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.457375 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7eba443-b255-4c4b-8aad-fc891c2a8a39" containerName="registry-server"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.458575 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.462390 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.463944 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5zkmx"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.463998 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.464205 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.473106 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.588695 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589152 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jsjc\" (UniqueName: \"kubernetes.io/projected/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-kube-api-access-7jsjc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589225 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589457 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
\"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589600 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589735 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589816 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.589911 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692306 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692396 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692435 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692474 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692513 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " 
pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692546 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jsjc\" (UniqueName: \"kubernetes.io/projected/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-kube-api-access-7jsjc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.692617 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.693107 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.693138 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.693634 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.694103 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.694681 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.694752 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.697196 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Dec 16 13:43:20 crc 
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.699601 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.700820 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.704191 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.712116 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jsjc\" (UniqueName: \"kubernetes.io/projected/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-kube-api-access-7jsjc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Dec 16 13:43:20 crc kubenswrapper[4757]: I1216 13:43:20.723182 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " pod="openstack/tempest-tests-tempest"
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.181883 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.182257 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.182317 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.183099 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ba36603b2ede44048e78f416348abfebbffb6f1388fd687a5fe481e6d72c629"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.183162 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://6ba36603b2ede44048e78f416348abfebbffb6f1388fd687a5fe481e6d72c629" gracePeriod=600 Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.267506 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.272232 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.787026 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8","Type":"ContainerStarted","Data":"35bff818b8e3b66d15c746f24a2968c478639494de5fdf3121527e5b27a4c836"} Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.793460 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="6ba36603b2ede44048e78f416348abfebbffb6f1388fd687a5fe481e6d72c629" exitCode=0 Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.793506 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"6ba36603b2ede44048e78f416348abfebbffb6f1388fd687a5fe481e6d72c629"} Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.793534 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"} Dec 16 13:43:21 crc kubenswrapper[4757]: I1216 13:43:21.793550 4757 scope.go:117] "RemoveContainer" 
containerID="52fa98cc873e24a9657be926211f890c0c2f27abca5ea0e03a30e7312c1653ec" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.029677 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vp5jc"] Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.032126 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.048098 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp5jc"] Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.183579 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-catalog-content\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.183742 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltsq\" (UniqueName: \"kubernetes.io/projected/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-kube-api-access-nltsq\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.184074 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-utilities\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.286533 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nltsq\" (UniqueName: \"kubernetes.io/projected/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-kube-api-access-nltsq\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.286641 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-utilities\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.286689 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-catalog-content\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.287228 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-catalog-content\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.287432 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-utilities\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.329889 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nltsq\" (UniqueName: \"kubernetes.io/projected/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-kube-api-access-nltsq\") pod \"redhat-marketplace-vp5jc\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:08 crc kubenswrapper[4757]: I1216 13:44:08.350521 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:11 crc kubenswrapper[4757]: E1216 13:44:11.976043 4757 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 16 13:44:11 crc kubenswrapper[4757]: E1216 13:44:11.976999 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jsjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,
Dec 16 13:44:11 crc kubenswrapper[4757]: E1216 13:44:11.976999 4757 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jsjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d2802d44-5cd2-4f45-80b0-d423d3ab6ea8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 16 13:44:11 crc kubenswrapper[4757]: E1216 13:44:11.978176 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"
Dec 16 13:44:12 crc kubenswrapper[4757]: W1216 13:44:12.184717 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13d4bbc_18fa_4caa_9589_76c821d8ee6d.slice/crio-4e06f6dc867239099313d7fc908f5e5fe23b4d45ed535d8f7384b42e30eccac2 WatchSource:0}: Error finding container 4e06f6dc867239099313d7fc908f5e5fe23b4d45ed535d8f7384b42e30eccac2: Status 404 returned error can't find the container with id 4e06f6dc867239099313d7fc908f5e5fe23b4d45ed535d8f7384b42e30eccac2
Dec 16 13:44:12 crc kubenswrapper[4757]: I1216 13:44:12.204081 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp5jc"]
Dec 16 13:44:12 crc kubenswrapper[4757]: I1216 13:44:12.276736 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp5jc" event={"ID":"d13d4bbc-18fa-4caa-9589-76c821d8ee6d","Type":"ContainerStarted","Data":"4e06f6dc867239099313d7fc908f5e5fe23b4d45ed535d8f7384b42e30eccac2"}
Dec 16 13:44:12 crc kubenswrapper[4757]: E1216 13:44:12.278551 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"
Dec 16 13:44:13 crc kubenswrapper[4757]: I1216 13:44:13.289606 4757 generic.go:334] "Generic (PLEG): container finished" podID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerID="54c4f419797e215661671a683776c571126b963e63c21e1e0e414f9434ad7819" exitCode=0
Dec 16 13:44:13 crc kubenswrapper[4757]: I1216 13:44:13.289685 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp5jc" event={"ID":"d13d4bbc-18fa-4caa-9589-76c821d8ee6d","Type":"ContainerDied","Data":"54c4f419797e215661671a683776c571126b963e63c21e1e0e414f9434ad7819"}
Dec 16 13:44:15 crc kubenswrapper[4757]: I1216 13:44:15.325863 4757 generic.go:334] "Generic (PLEG): container finished" podID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerID="848a79e45ec4897d7ede60e6ff22631f676b22600e82f0e303b57f6629a56b40" exitCode=0
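After the cancelled pull the pod falls into ImagePullBackOff: each retry waits roughly twice as long as the last, starting around 10s and capped at 5 minutes (the kubelet's default image back-off). A sketch of that schedule:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute // kubelet defaults
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("pull attempt %d: back off %v\n", attempt, delay)
		delay *= 2 // exponential growth per failed attempt
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s
}
```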
Dec 16 13:44:15 crc kubenswrapper[4757]: I1216 13:44:15.325948 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp5jc" event={"ID":"d13d4bbc-18fa-4caa-9589-76c821d8ee6d","Type":"ContainerDied","Data":"848a79e45ec4897d7ede60e6ff22631f676b22600e82f0e303b57f6629a56b40"}
Dec 16 13:44:17 crc kubenswrapper[4757]: I1216 13:44:17.344564 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp5jc" event={"ID":"d13d4bbc-18fa-4caa-9589-76c821d8ee6d","Type":"ContainerStarted","Data":"2f73121618240f793e409acf49a79602a2889b9f13a7a8af764604433e5d4838"}
Dec 16 13:44:17 crc kubenswrapper[4757]: I1216 13:44:17.370220 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vp5jc" podStartSLOduration=6.610077778 podStartE2EDuration="9.370199321s" podCreationTimestamp="2025-12-16 13:44:08 +0000 UTC" firstStartedPulling="2025-12-16 13:44:13.293468685 +0000 UTC m=+3438.721212491" lastFinishedPulling="2025-12-16 13:44:16.053590238 +0000 UTC m=+3441.481334034" observedRunningTime="2025-12-16 13:44:17.362674754 +0000 UTC m=+3442.790418550" watchObservedRunningTime="2025-12-16 13:44:17.370199321 +0000 UTC m=+3442.797943117"
Dec 16 13:44:18 crc kubenswrapper[4757]: I1216 13:44:18.351487 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vp5jc"
Dec 16 13:44:18 crc kubenswrapper[4757]: I1216 13:44:18.351535 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vp5jc"
Dec 16 13:44:18 crc kubenswrapper[4757]: I1216 13:44:18.405865 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vp5jc"
Dec 16 13:44:28 crc kubenswrapper[4757]: I1216 13:44:28.400630 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vp5jc"
Dec 16 13:44:28 crc kubenswrapper[4757]: I1216 13:44:28.455700 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp5jc"]
Dec 16 13:44:28 crc kubenswrapper[4757]: I1216 13:44:28.455971 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vp5jc" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="registry-server" containerID="cri-o://2f73121618240f793e409acf49a79602a2889b9f13a7a8af764604433e5d4838" gracePeriod=2
Dec 16 13:44:28 crc kubenswrapper[4757]: I1216 13:44:28.979130 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.445193 4757 generic.go:334] "Generic (PLEG): container finished" podID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerID="2f73121618240f793e409acf49a79602a2889b9f13a7a8af764604433e5d4838" exitCode=0
Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.445237 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp5jc" event={"ID":"d13d4bbc-18fa-4caa-9589-76c821d8ee6d","Type":"ContainerDied","Data":"2f73121618240f793e409acf49a79602a2889b9f13a7a8af764604433e5d4838"}
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.648078 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-catalog-content\") pod \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.648200 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nltsq\" (UniqueName: \"kubernetes.io/projected/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-kube-api-access-nltsq\") pod \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.648257 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-utilities\") pod \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\" (UID: \"d13d4bbc-18fa-4caa-9589-76c821d8ee6d\") " Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.649577 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-utilities" (OuterVolumeSpecName: "utilities") pod "d13d4bbc-18fa-4caa-9589-76c821d8ee6d" (UID: "d13d4bbc-18fa-4caa-9589-76c821d8ee6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.652095 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-kube-api-access-nltsq" (OuterVolumeSpecName: "kube-api-access-nltsq") pod "d13d4bbc-18fa-4caa-9589-76c821d8ee6d" (UID: "d13d4bbc-18fa-4caa-9589-76c821d8ee6d"). InnerVolumeSpecName "kube-api-access-nltsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.670479 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d13d4bbc-18fa-4caa-9589-76c821d8ee6d" (UID: "d13d4bbc-18fa-4caa-9589-76c821d8ee6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.755943 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nltsq\" (UniqueName: \"kubernetes.io/projected/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-kube-api-access-nltsq\") on node \"crc\" DevicePath \"\"" Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.755982 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:44:29 crc kubenswrapper[4757]: I1216 13:44:29.755991 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13d4bbc-18fa-4caa-9589-76c821d8ee6d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.458409 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vp5jc" event={"ID":"d13d4bbc-18fa-4caa-9589-76c821d8ee6d","Type":"ContainerDied","Data":"4e06f6dc867239099313d7fc908f5e5fe23b4d45ed535d8f7384b42e30eccac2"} Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.458827 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vp5jc" Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.458858 4757 scope.go:117] "RemoveContainer" containerID="2f73121618240f793e409acf49a79602a2889b9f13a7a8af764604433e5d4838" Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.505619 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp5jc"] Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.509047 4757 scope.go:117] "RemoveContainer" containerID="848a79e45ec4897d7ede60e6ff22631f676b22600e82f0e303b57f6629a56b40" Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.514941 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vp5jc"] Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.538249 4757 scope.go:117] "RemoveContainer" containerID="54c4f419797e215661671a683776c571126b963e63c21e1e0e414f9434ad7819" Dec 16 13:44:30 crc kubenswrapper[4757]: I1216 13:44:30.967149 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" path="/var/lib/kubelet/pods/d13d4bbc-18fa-4caa-9589-76c821d8ee6d/volumes" Dec 16 13:44:31 crc kubenswrapper[4757]: I1216 13:44:31.469383 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8","Type":"ContainerStarted","Data":"8dd50f318ea809b86ce4b54c8f97e070e5be854c7486dfe17a7d66471b8752df"} Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.176427 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=33.466734373 podStartE2EDuration="1m41.176409304s" podCreationTimestamp="2025-12-16 13:43:19 +0000 UTC" firstStartedPulling="2025-12-16 13:43:21.266924238 +0000 UTC m=+3386.694668034" lastFinishedPulling="2025-12-16 13:44:28.976599169 +0000 UTC m=+3454.404342965" observedRunningTime="2025-12-16 13:44:32.504711601 +0000 UTC m=+3457.932455397" watchObservedRunningTime="2025-12-16 13:45:00.176409304 +0000 UTC m=+3485.604153120" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.184813 4757 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c"] Dec 16 13:45:00 crc kubenswrapper[4757]: E1216 13:45:00.185295 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="extract-utilities" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.185311 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="extract-utilities" Dec 16 13:45:00 crc kubenswrapper[4757]: E1216 13:45:00.185334 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="extract-content" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.185340 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="extract-content" Dec 16 13:45:00 crc kubenswrapper[4757]: E1216 13:45:00.185369 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="registry-server" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.185375 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="registry-server" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.185567 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13d4bbc-18fa-4caa-9589-76c821d8ee6d" containerName="registry-server" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.186420 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.188875 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.195544 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.203343 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c"] Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.307871 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-config-volume\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.308216 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnspz\" (UniqueName: \"kubernetes.io/projected/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-kube-api-access-lnspz\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.308578 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-secret-volume\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.411085 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-secret-volume\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.411191 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-config-volume\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.411276 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnspz\" (UniqueName: \"kubernetes.io/projected/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-kube-api-access-lnspz\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.412204 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-config-volume\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.430896 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-secret-volume\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.453737 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnspz\" (UniqueName: \"kubernetes.io/projected/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-kube-api-access-lnspz\") pod \"collect-profiles-29431545-9fx4c\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:00 crc kubenswrapper[4757]: I1216 13:45:00.513466 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:01 crc kubenswrapper[4757]: I1216 13:45:01.029481 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c"] Dec 16 13:45:01 crc kubenswrapper[4757]: W1216 13:45:01.056312 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf74f570_2f9c_4e54_bc70_3f84c288cbc4.slice/crio-70af8638a980797c8da1517f2aadb1f00f1be20514322da79d72d0f26964284c WatchSource:0}: Error finding container 70af8638a980797c8da1517f2aadb1f00f1be20514322da79d72d0f26964284c: Status 404 returned error can't find the container with id 70af8638a980797c8da1517f2aadb1f00f1be20514322da79d72d0f26964284c Dec 16 13:45:01 crc kubenswrapper[4757]: I1216 13:45:01.741795 4757 generic.go:334] "Generic (PLEG): container finished" podID="cf74f570-2f9c-4e54-bc70-3f84c288cbc4" containerID="31b75ff9b29ae80230dd98acdfb4f6079cb5d9784ac9e225bf7e24f0fa01b4e2" exitCode=0 Dec 16 13:45:01 crc kubenswrapper[4757]: I1216 13:45:01.741854 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" event={"ID":"cf74f570-2f9c-4e54-bc70-3f84c288cbc4","Type":"ContainerDied","Data":"31b75ff9b29ae80230dd98acdfb4f6079cb5d9784ac9e225bf7e24f0fa01b4e2"} Dec 16 13:45:01 crc kubenswrapper[4757]: I1216 13:45:01.742041 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" event={"ID":"cf74f570-2f9c-4e54-bc70-3f84c288cbc4","Type":"ContainerStarted","Data":"70af8638a980797c8da1517f2aadb1f00f1be20514322da79d72d0f26964284c"} Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.102058 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.169326 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-secret-volume\") pod \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.169713 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-config-volume\") pod \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.169811 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnspz\" (UniqueName: \"kubernetes.io/projected/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-kube-api-access-lnspz\") pod \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\" (UID: \"cf74f570-2f9c-4e54-bc70-3f84c288cbc4\") " Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.172749 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf74f570-2f9c-4e54-bc70-3f84c288cbc4" (UID: "cf74f570-2f9c-4e54-bc70-3f84c288cbc4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.187288 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf74f570-2f9c-4e54-bc70-3f84c288cbc4" (UID: "cf74f570-2f9c-4e54-bc70-3f84c288cbc4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.187302 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-kube-api-access-lnspz" (OuterVolumeSpecName: "kube-api-access-lnspz") pod "cf74f570-2f9c-4e54-bc70-3f84c288cbc4" (UID: "cf74f570-2f9c-4e54-bc70-3f84c288cbc4"). InnerVolumeSpecName "kube-api-access-lnspz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.271573 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.271934 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.271945 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnspz\" (UniqueName: \"kubernetes.io/projected/cf74f570-2f9c-4e54-bc70-3f84c288cbc4-kube-api-access-lnspz\") on node \"crc\" DevicePath \"\"" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.765167 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" event={"ID":"cf74f570-2f9c-4e54-bc70-3f84c288cbc4","Type":"ContainerDied","Data":"70af8638a980797c8da1517f2aadb1f00f1be20514322da79d72d0f26964284c"} Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.765226 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70af8638a980797c8da1517f2aadb1f00f1be20514322da79d72d0f26964284c" Dec 16 13:45:03 crc kubenswrapper[4757]: I1216 13:45:03.765300 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431545-9fx4c" Dec 16 13:45:04 crc kubenswrapper[4757]: I1216 13:45:04.190126 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj"] Dec 16 13:45:04 crc kubenswrapper[4757]: I1216 13:45:04.197877 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431500-m5mqj"] Dec 16 13:45:04 crc kubenswrapper[4757]: I1216 13:45:04.960839 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04c39a6c-e3d7-411a-bc03-f65ff0fe6167" path="/var/lib/kubelet/pods/04c39a6c-e3d7-411a-bc03-f65ff0fe6167/volumes" Dec 16 13:45:21 crc kubenswrapper[4757]: I1216 13:45:21.181169 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:45:21 crc kubenswrapper[4757]: I1216 13:45:21.181686 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.165352 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s6rgr"] Dec 16 13:45:25 crc kubenswrapper[4757]: E1216 13:45:25.166232 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf74f570-2f9c-4e54-bc70-3f84c288cbc4" containerName="collect-profiles" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.166244 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf74f570-2f9c-4e54-bc70-3f84c288cbc4" containerName="collect-profiles" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.166461 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf74f570-2f9c-4e54-bc70-3f84c288cbc4" containerName="collect-profiles" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.167709 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.193605 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6rgr"] Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.277338 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-utilities\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.277390 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s2bw\" (UniqueName: \"kubernetes.io/projected/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-kube-api-access-2s2bw\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.277444 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-catalog-content\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.379459 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-utilities\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.379508 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s2bw\" (UniqueName: \"kubernetes.io/projected/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-kube-api-access-2s2bw\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.379571 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-catalog-content\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.379927 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-utilities\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.380039 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-catalog-content\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.413087 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2s2bw\" (UniqueName: \"kubernetes.io/projected/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-kube-api-access-2s2bw\") pod \"community-operators-s6rgr\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:25 crc kubenswrapper[4757]: I1216 13:45:25.488711 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:26 crc kubenswrapper[4757]: I1216 13:45:26.174301 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6rgr"] Dec 16 13:45:26 crc kubenswrapper[4757]: W1216 13:45:26.181484 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod399dde83_8f15_4edd_aa37_6c0cfd0d3eef.slice/crio-586ed3c49a3b99c9c4d68ccab91ad29153bf4100fa9412bfaa0ba70027fa14a3 WatchSource:0}: Error finding container 586ed3c49a3b99c9c4d68ccab91ad29153bf4100fa9412bfaa0ba70027fa14a3: Status 404 returned error can't find the container with id 586ed3c49a3b99c9c4d68ccab91ad29153bf4100fa9412bfaa0ba70027fa14a3 Dec 16 13:45:26 crc kubenswrapper[4757]: I1216 13:45:26.978581 4757 generic.go:334] "Generic (PLEG): container finished" podID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerID="1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5" exitCode=0 Dec 16 13:45:26 crc kubenswrapper[4757]: I1216 13:45:26.978805 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6rgr" event={"ID":"399dde83-8f15-4edd-aa37-6c0cfd0d3eef","Type":"ContainerDied","Data":"1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5"} Dec 16 13:45:26 crc kubenswrapper[4757]: I1216 13:45:26.978946 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6rgr" event={"ID":"399dde83-8f15-4edd-aa37-6c0cfd0d3eef","Type":"ContainerStarted","Data":"586ed3c49a3b99c9c4d68ccab91ad29153bf4100fa9412bfaa0ba70027fa14a3"} Dec 16 13:45:28 crc kubenswrapper[4757]: I1216 13:45:28.998567 4757 generic.go:334] "Generic (PLEG): container finished" podID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerID="305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655" exitCode=0 Dec 16 13:45:29 crc kubenswrapper[4757]: I1216 13:45:28.998794 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6rgr" event={"ID":"399dde83-8f15-4edd-aa37-6c0cfd0d3eef","Type":"ContainerDied","Data":"305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655"} Dec 16 13:45:30 crc kubenswrapper[4757]: I1216 13:45:30.009603 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6rgr" event={"ID":"399dde83-8f15-4edd-aa37-6c0cfd0d3eef","Type":"ContainerStarted","Data":"9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21"} Dec 16 13:45:30 crc kubenswrapper[4757]: I1216 13:45:30.066746 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s6rgr" podStartSLOduration=2.490741387 podStartE2EDuration="5.066720744s" podCreationTimestamp="2025-12-16 13:45:25 +0000 UTC" firstStartedPulling="2025-12-16 13:45:26.980624089 +0000 UTC m=+3512.408367885" lastFinishedPulling="2025-12-16 13:45:29.556603446 +0000 UTC m=+3514.984347242" observedRunningTime="2025-12-16 13:45:30.047839793 +0000 UTC 
m=+3515.475583599" watchObservedRunningTime="2025-12-16 13:45:30.066720744 +0000 UTC m=+3515.494464540" Dec 16 13:45:31 crc kubenswrapper[4757]: I1216 13:45:31.539110 4757 scope.go:117] "RemoveContainer" containerID="c768050d1fed98fc8472ee568d189da5c7c0a1d5cd0b8013b47b7399df75f307" Dec 16 13:45:35 crc kubenswrapper[4757]: I1216 13:45:35.489186 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:35 crc kubenswrapper[4757]: I1216 13:45:35.489777 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:35 crc kubenswrapper[4757]: I1216 13:45:35.547430 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:36 crc kubenswrapper[4757]: I1216 13:45:36.131290 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:36 crc kubenswrapper[4757]: I1216 13:45:36.190045 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6rgr"] Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.091337 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s6rgr" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="registry-server" containerID="cri-o://9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21" gracePeriod=2 Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.707797 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.871166 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-catalog-content\") pod \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.871279 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-utilities\") pod \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.871455 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s2bw\" (UniqueName: \"kubernetes.io/projected/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-kube-api-access-2s2bw\") pod \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\" (UID: \"399dde83-8f15-4edd-aa37-6c0cfd0d3eef\") " Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.873140 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-utilities" (OuterVolumeSpecName: "utilities") pod "399dde83-8f15-4edd-aa37-6c0cfd0d3eef" (UID: "399dde83-8f15-4edd-aa37-6c0cfd0d3eef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.896159 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-kube-api-access-2s2bw" (OuterVolumeSpecName: "kube-api-access-2s2bw") pod "399dde83-8f15-4edd-aa37-6c0cfd0d3eef" (UID: "399dde83-8f15-4edd-aa37-6c0cfd0d3eef"). InnerVolumeSpecName "kube-api-access-2s2bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.977428 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:45:38 crc kubenswrapper[4757]: I1216 13:45:38.977717 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s2bw\" (UniqueName: \"kubernetes.io/projected/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-kube-api-access-2s2bw\") on node \"crc\" DevicePath \"\"" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.103081 4757 generic.go:334] "Generic (PLEG): container finished" podID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerID="9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21" exitCode=0 Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.103313 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6rgr" event={"ID":"399dde83-8f15-4edd-aa37-6c0cfd0d3eef","Type":"ContainerDied","Data":"9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21"} Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.103466 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6rgr" event={"ID":"399dde83-8f15-4edd-aa37-6c0cfd0d3eef","Type":"ContainerDied","Data":"586ed3c49a3b99c9c4d68ccab91ad29153bf4100fa9412bfaa0ba70027fa14a3"} Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.103492 4757 scope.go:117] "RemoveContainer" containerID="9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.104200 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6rgr" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.126907 4757 scope.go:117] "RemoveContainer" containerID="305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.153614 4757 scope.go:117] "RemoveContainer" containerID="1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.197841 4757 scope.go:117] "RemoveContainer" containerID="9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21" Dec 16 13:45:39 crc kubenswrapper[4757]: E1216 13:45:39.198412 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21\": container with ID starting with 9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21 not found: ID does not exist" containerID="9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.198452 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21"} err="failed to get container status \"9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21\": rpc error: code = NotFound desc = could not find container \"9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21\": container with ID starting with 9fea91e791e8f5b12e94dcc38dc9309017eea744f39ceb104cf3f91601a0ee21 not found: ID does not exist" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.198482 4757 scope.go:117] "RemoveContainer" containerID="305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655" Dec 16 13:45:39 crc kubenswrapper[4757]: E1216 13:45:39.198820 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655\": container with ID starting with 305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655 not found: ID does not exist" containerID="305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.198847 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655"} err="failed to get container status \"305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655\": rpc error: code = NotFound desc = could not find container \"305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655\": container with ID starting with 305fbbc309ad582fb0451d2459e30d202cbb2772fe15331f5ac79d744861c655 not found: ID does not exist" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.198864 4757 scope.go:117] "RemoveContainer" containerID="1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5" Dec 16 13:45:39 crc kubenswrapper[4757]: E1216 13:45:39.199073 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5\": container with ID starting with 1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5 not found: ID does not exist" containerID="1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5" 
Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.199098 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5"} err="failed to get container status \"1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5\": rpc error: code = NotFound desc = could not find container \"1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5\": container with ID starting with 1ee77686609649abaae5345a4f4eee56ce0988caceb8732662709833b0c132d5 not found: ID does not exist" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.602000 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "399dde83-8f15-4edd-aa37-6c0cfd0d3eef" (UID: "399dde83-8f15-4edd-aa37-6c0cfd0d3eef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.696733 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399dde83-8f15-4edd-aa37-6c0cfd0d3eef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.745242 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6rgr"] Dec 16 13:45:39 crc kubenswrapper[4757]: I1216 13:45:39.757051 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s6rgr"] Dec 16 13:45:40 crc kubenswrapper[4757]: I1216 13:45:40.963828 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" path="/var/lib/kubelet/pods/399dde83-8f15-4edd-aa37-6c0cfd0d3eef/volumes" Dec 16 13:45:51 crc kubenswrapper[4757]: I1216 13:45:51.182345 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:45:51 crc kubenswrapper[4757]: I1216 13:45:51.182938 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.181904 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.182889 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.182957 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.184177 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.184247 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a" gracePeriod=600 Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.482922 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a" exitCode=0 Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.482966 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"} Dec 16 13:46:21 crc kubenswrapper[4757]: I1216 13:46:21.482998 4757 scope.go:117] "RemoveContainer" containerID="6ba36603b2ede44048e78f416348abfebbffb6f1388fd687a5fe481e6d72c629" Dec 16 13:46:22 crc kubenswrapper[4757]: E1216 13:46:22.096559 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:46:22 crc kubenswrapper[4757]: I1216 13:46:22.492902 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a" Dec 16 13:46:22 crc kubenswrapper[4757]: E1216 13:46:22.493208 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:46:33 crc kubenswrapper[4757]: I1216 13:46:33.949161 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a" Dec 16 13:46:33 crc kubenswrapper[4757]: E1216 13:46:33.950089 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:46:44 crc kubenswrapper[4757]: I1216 
Dec 16 13:46:44 crc kubenswrapper[4757]: E1216 13:46:44.959617 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:46:57 crc kubenswrapper[4757]: I1216 13:46:57.948726 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:46:57 crc kubenswrapper[4757]: E1216 13:46:57.950022 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:47:10 crc kubenswrapper[4757]: I1216 13:47:10.950062 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:47:10 crc kubenswrapper[4757]: E1216 13:47:10.950826 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:47:24 crc kubenswrapper[4757]: I1216 13:47:24.957346 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:47:24 crc kubenswrapper[4757]: E1216 13:47:24.959262 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:47:36 crc kubenswrapper[4757]: I1216 13:47:36.949228 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:47:36 crc kubenswrapper[4757]: E1216 13:47:36.951061 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:47:47 crc kubenswrapper[4757]: I1216 13:47:47.949171 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:47:47 crc kubenswrapper[4757]: E1216 13:47:47.950014 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:48:02 crc kubenswrapper[4757]: I1216 13:48:02.950079 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:48:02 crc kubenswrapper[4757]: E1216 13:48:02.952175 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:48:15 crc kubenswrapper[4757]: I1216 13:48:15.949345 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:48:15 crc kubenswrapper[4757]: E1216 13:48:15.951761 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:48:29 crc kubenswrapper[4757]: I1216 13:48:29.949209 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:48:29 crc kubenswrapper[4757]: E1216 13:48:29.949938 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:48:42 crc kubenswrapper[4757]: I1216 13:48:42.948832 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:48:42 crc kubenswrapper[4757]: E1216 13:48:42.949809 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:48:55 crc kubenswrapper[4757]: I1216 13:48:55.949176 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:48:55 crc kubenswrapper[4757]: E1216 13:48:55.950105 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:49:07 crc kubenswrapper[4757]: I1216 13:49:07.949261 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:49:07 crc kubenswrapper[4757]: E1216 13:49:07.952344 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:49:20 crc kubenswrapper[4757]: I1216 13:49:20.949183 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:49:20 crc kubenswrapper[4757]: E1216 13:49:20.950067 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:49:32 crc kubenswrapper[4757]: I1216 13:49:32.948549 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:49:32 crc kubenswrapper[4757]: E1216 13:49:32.949192 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:49:43 crc kubenswrapper[4757]: I1216 13:49:43.949358 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:49:43 crc kubenswrapper[4757]: E1216 13:49:43.950235 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:49:55 crc kubenswrapper[4757]: I1216 13:49:55.949282 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:49:55 crc kubenswrapper[4757]: E1216 13:49:55.950739 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:50:07 crc kubenswrapper[4757]: I1216 13:50:07.949125 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:50:07 crc kubenswrapper[4757]: E1216 13:50:07.949888 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:50:20 crc kubenswrapper[4757]: I1216 13:50:20.950297 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:50:20 crc kubenswrapper[4757]: E1216 13:50:20.951211 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:50:32 crc kubenswrapper[4757]: I1216 13:50:32.950534 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:50:32 crc kubenswrapper[4757]: E1216 13:50:32.951446 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:50:47 crc kubenswrapper[4757]: I1216 13:50:47.949154 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:50:47 crc kubenswrapper[4757]: E1216 13:50:47.950057 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:50:58 crc kubenswrapper[4757]: I1216 13:50:58.949649 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:50:58 crc kubenswrapper[4757]: E1216 13:50:58.950538 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:51:12 crc kubenswrapper[4757]: I1216 13:51:12.950056 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:51:12 crc kubenswrapper[4757]: E1216 13:51:12.950838 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 13:51:27 crc kubenswrapper[4757]: I1216 13:51:27.951292 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a"
Dec 16 13:51:30 crc kubenswrapper[4757]: I1216 13:51:30.275976 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"fc2b581c5d760a9f331945725f63ec478a166eec486ac29ae73884d9107add8d"}
Dec 16 13:51:54 crc kubenswrapper[4757]: I1216 13:51:54.995168 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j9l6b"]
Dec 16 13:51:54 crc kubenswrapper[4757]: E1216 13:51:54.996288 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="extract-utilities"
Dec 16 13:51:54 crc kubenswrapper[4757]: I1216 13:51:54.996305 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="extract-utilities"
Dec 16 13:51:54 crc kubenswrapper[4757]: E1216 13:51:54.996319 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="extract-content"
Dec 16 13:51:54 crc kubenswrapper[4757]: I1216 13:51:54.996327 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="extract-content"
Dec 16 13:51:54 crc kubenswrapper[4757]: E1216 13:51:54.996357 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="registry-server"
Dec 16 13:51:54 crc kubenswrapper[4757]: I1216 13:51:54.996365 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="registry-server"
Dec 16 13:51:54 crc kubenswrapper[4757]: I1216 13:51:54.996613 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="399dde83-8f15-4edd-aa37-6c0cfd0d3eef" containerName="registry-server"
Dec 16 13:51:54 crc kubenswrapper[4757]: I1216 13:51:54.998290 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9l6b"
Need to start a new one" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.006722 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9l6b"] Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.143555 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-utilities\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.143689 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-catalog-content\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.143839 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdmcs\" (UniqueName: \"kubernetes.io/projected/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-kube-api-access-cdmcs\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.245547 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdmcs\" (UniqueName: \"kubernetes.io/projected/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-kube-api-access-cdmcs\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.245647 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-utilities\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.245710 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-catalog-content\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.246190 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-utilities\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.246259 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-catalog-content\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.268321 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cdmcs\" (UniqueName: \"kubernetes.io/projected/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-kube-api-access-cdmcs\") pod \"certified-operators-j9l6b\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.342782 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:51:55 crc kubenswrapper[4757]: I1216 13:51:55.959348 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9l6b"] Dec 16 13:51:56 crc kubenswrapper[4757]: I1216 13:51:56.506247 4757 generic.go:334] "Generic (PLEG): container finished" podID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerID="5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a" exitCode=0 Dec 16 13:51:56 crc kubenswrapper[4757]: I1216 13:51:56.506316 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerDied","Data":"5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a"} Dec 16 13:51:56 crc kubenswrapper[4757]: I1216 13:51:56.506354 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerStarted","Data":"dbf457f63bf6d5d1f851421b4992920a5a9910ec3ce758af3beadaad93cd42fb"} Dec 16 13:51:56 crc kubenswrapper[4757]: I1216 13:51:56.508264 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 13:51:58 crc kubenswrapper[4757]: I1216 13:51:58.524767 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerStarted","Data":"5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9"} Dec 16 13:52:00 crc kubenswrapper[4757]: I1216 13:52:00.541172 4757 generic.go:334] "Generic (PLEG): container finished" podID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerID="5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9" exitCode=0 Dec 16 13:52:00 crc kubenswrapper[4757]: I1216 13:52:00.541224 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerDied","Data":"5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9"} Dec 16 13:52:01 crc kubenswrapper[4757]: I1216 13:52:01.552135 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerStarted","Data":"aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5"} Dec 16 13:52:01 crc kubenswrapper[4757]: I1216 13:52:01.575387 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j9l6b" podStartSLOduration=2.972977168 podStartE2EDuration="7.575367968s" podCreationTimestamp="2025-12-16 13:51:54 +0000 UTC" firstStartedPulling="2025-12-16 13:51:56.507884953 +0000 UTC m=+3901.935628749" lastFinishedPulling="2025-12-16 13:52:01.110275753 +0000 UTC m=+3906.538019549" observedRunningTime="2025-12-16 13:52:01.569963383 +0000 UTC m=+3906.997707189" watchObservedRunningTime="2025-12-16 
13:52:01.575367968 +0000 UTC m=+3907.003111764" Dec 16 13:52:05 crc kubenswrapper[4757]: I1216 13:52:05.342956 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:52:05 crc kubenswrapper[4757]: I1216 13:52:05.343554 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:52:05 crc kubenswrapper[4757]: I1216 13:52:05.408618 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:52:15 crc kubenswrapper[4757]: I1216 13:52:15.404112 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:52:15 crc kubenswrapper[4757]: I1216 13:52:15.465195 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9l6b"] Dec 16 13:52:15 crc kubenswrapper[4757]: I1216 13:52:15.679180 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j9l6b" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="registry-server" containerID="cri-o://aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5" gracePeriod=2 Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.344698 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.436937 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdmcs\" (UniqueName: \"kubernetes.io/projected/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-kube-api-access-cdmcs\") pod \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.437035 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-utilities\") pod \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.437172 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-catalog-content\") pod \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\" (UID: \"eb2db54f-9491-4703-b6a2-3ff59fa4ca50\") " Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.438053 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-utilities" (OuterVolumeSpecName: "utilities") pod "eb2db54f-9491-4703-b6a2-3ff59fa4ca50" (UID: "eb2db54f-9491-4703-b6a2-3ff59fa4ca50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.443393 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-kube-api-access-cdmcs" (OuterVolumeSpecName: "kube-api-access-cdmcs") pod "eb2db54f-9491-4703-b6a2-3ff59fa4ca50" (UID: "eb2db54f-9491-4703-b6a2-3ff59fa4ca50"). InnerVolumeSpecName "kube-api-access-cdmcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.498362 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb2db54f-9491-4703-b6a2-3ff59fa4ca50" (UID: "eb2db54f-9491-4703-b6a2-3ff59fa4ca50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.539234 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.539262 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.539273 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdmcs\" (UniqueName: \"kubernetes.io/projected/eb2db54f-9491-4703-b6a2-3ff59fa4ca50-kube-api-access-cdmcs\") on node \"crc\" DevicePath \"\"" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.689961 4757 generic.go:334] "Generic (PLEG): container finished" podID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerID="aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5" exitCode=0 Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.690030 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerDied","Data":"aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5"} Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.690064 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9l6b" event={"ID":"eb2db54f-9491-4703-b6a2-3ff59fa4ca50","Type":"ContainerDied","Data":"dbf457f63bf6d5d1f851421b4992920a5a9910ec3ce758af3beadaad93cd42fb"} Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.690086 4757 scope.go:117] "RemoveContainer" containerID="aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.690241 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9l6b" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.730962 4757 scope.go:117] "RemoveContainer" containerID="5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.744237 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9l6b"] Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.756058 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j9l6b"] Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.770694 4757 scope.go:117] "RemoveContainer" containerID="5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.827374 4757 scope.go:117] "RemoveContainer" containerID="aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5" Dec 16 13:52:16 crc kubenswrapper[4757]: E1216 13:52:16.827832 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5\": container with ID starting with aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5 not found: ID does not exist" containerID="aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.827884 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5"} err="failed to get container status \"aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5\": rpc error: code = NotFound desc = could not find container \"aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5\": container with ID starting with aa293da1682261d3dc16b361b571e878c35fe3b8b412bf47bc100098f363b6b5 not found: ID does not exist" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.827914 4757 scope.go:117] "RemoveContainer" containerID="5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9" Dec 16 13:52:16 crc kubenswrapper[4757]: E1216 13:52:16.828281 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9\": container with ID starting with 5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9 not found: ID does not exist" containerID="5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.828314 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9"} err="failed to get container status \"5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9\": rpc error: code = NotFound desc = could not find container \"5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9\": container with ID starting with 5a9105cabe4d58336a90ce016e9dcdeccaa16db6a9f9be6a64bd7c21864738a9 not found: ID does not exist" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.828351 4757 scope.go:117] "RemoveContainer" containerID="5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a" Dec 16 13:52:16 crc kubenswrapper[4757]: E1216 13:52:16.828630 4757 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a\": container with ID starting with 5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a not found: ID does not exist" containerID="5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.828649 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a"} err="failed to get container status \"5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a\": rpc error: code = NotFound desc = could not find container \"5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a\": container with ID starting with 5dc680b7d4f8aec2306f322ee89994c9909c872fd63d845810ca8770c3dccd1a not found: ID does not exist" Dec 16 13:52:16 crc kubenswrapper[4757]: I1216 13:52:16.961528 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" path="/var/lib/kubelet/pods/eb2db54f-9491-4703-b6a2-3ff59fa4ca50/volumes" Dec 16 13:53:51 crc kubenswrapper[4757]: I1216 13:53:51.181291 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:53:51 crc kubenswrapper[4757]: I1216 13:53:51.181917 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:54:17 crc kubenswrapper[4757]: I1216 13:54:17.990064 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bk524"] Dec 16 13:54:17 crc kubenswrapper[4757]: E1216 13:54:17.991122 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="extract-content" Dec 16 13:54:17 crc kubenswrapper[4757]: I1216 13:54:17.991139 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="extract-content" Dec 16 13:54:17 crc kubenswrapper[4757]: E1216 13:54:17.991170 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="extract-utilities" Dec 16 13:54:17 crc kubenswrapper[4757]: I1216 13:54:17.991178 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="extract-utilities" Dec 16 13:54:17 crc kubenswrapper[4757]: E1216 13:54:17.991208 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="registry-server" Dec 16 13:54:17 crc kubenswrapper[4757]: I1216 13:54:17.991216 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="registry-server" Dec 16 13:54:17 crc kubenswrapper[4757]: I1216 13:54:17.991446 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2db54f-9491-4703-b6a2-3ff59fa4ca50" containerName="registry-server" Dec 16 13:54:17 crc kubenswrapper[4757]: I1216 
13:54:17.993430 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.024900 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bk524"] Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.112462 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-utilities\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.112608 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-catalog-content\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.112728 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvdsb\" (UniqueName: \"kubernetes.io/projected/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-kube-api-access-cvdsb\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.214472 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-catalog-content\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.214539 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvdsb\" (UniqueName: \"kubernetes.io/projected/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-kube-api-access-cvdsb\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.214625 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-utilities\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.215174 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-utilities\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.215436 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-catalog-content\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.254995 4757 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvdsb\" (UniqueName: \"kubernetes.io/projected/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-kube-api-access-cvdsb\") pod \"redhat-operators-bk524\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.314982 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.630901 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z89sv"] Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.656651 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.678268 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z89sv"] Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.741458 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvg4f\" (UniqueName: \"kubernetes.io/projected/bc70b314-f05d-40d2-ac58-d73b18bde652-kube-api-access-fvg4f\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.741826 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-utilities\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.741925 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-catalog-content\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.843909 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-utilities\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.844062 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-catalog-content\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.844144 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvg4f\" (UniqueName: \"kubernetes.io/projected/bc70b314-f05d-40d2-ac58-d73b18bde652-kube-api-access-fvg4f\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.845267 4757 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-catalog-content\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.845690 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-utilities\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.854392 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bk524"] Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.889538 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvg4f\" (UniqueName: \"kubernetes.io/projected/bc70b314-f05d-40d2-ac58-d73b18bde652-kube-api-access-fvg4f\") pod \"redhat-marketplace-z89sv\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:18 crc kubenswrapper[4757]: I1216 13:54:18.999798 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:19 crc kubenswrapper[4757]: I1216 13:54:19.546195 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z89sv"] Dec 16 13:54:19 crc kubenswrapper[4757]: I1216 13:54:19.815515 4757 generic.go:334] "Generic (PLEG): container finished" podID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerID="20519d695f2a451fbecab521f5e0d68c9d3762c2ee65b63de0bfc43009c1a32c" exitCode=0 Dec 16 13:54:19 crc kubenswrapper[4757]: I1216 13:54:19.815567 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerDied","Data":"20519d695f2a451fbecab521f5e0d68c9d3762c2ee65b63de0bfc43009c1a32c"} Dec 16 13:54:19 crc kubenswrapper[4757]: I1216 13:54:19.815613 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerStarted","Data":"be57067bfc230eb38f19deed7b35d056ada2805b1705471f19a81ca792217c47"} Dec 16 13:54:20 crc kubenswrapper[4757]: W1216 13:54:20.000176 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc70b314_f05d_40d2_ac58_d73b18bde652.slice/crio-af3776a9e7b2e7ced9e25daf3a2089b52ca8453f0791defeae162aed707b7201 WatchSource:0}: Error finding container af3776a9e7b2e7ced9e25daf3a2089b52ca8453f0791defeae162aed707b7201: Status 404 returned error can't find the container with id af3776a9e7b2e7ced9e25daf3a2089b52ca8453f0791defeae162aed707b7201 Dec 16 13:54:20 crc kubenswrapper[4757]: I1216 13:54:20.828335 4757 generic.go:334] "Generic (PLEG): container finished" podID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerID="6cac8fb7b77faea4e3a9731d6ff34a92df5c0662f3f73d234c24f6de2b5c4abe" exitCode=0 Dec 16 13:54:20 crc kubenswrapper[4757]: I1216 13:54:20.828400 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" 
event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerDied","Data":"6cac8fb7b77faea4e3a9731d6ff34a92df5c0662f3f73d234c24f6de2b5c4abe"} Dec 16 13:54:20 crc kubenswrapper[4757]: I1216 13:54:20.829375 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerStarted","Data":"af3776a9e7b2e7ced9e25daf3a2089b52ca8453f0791defeae162aed707b7201"} Dec 16 13:54:21 crc kubenswrapper[4757]: I1216 13:54:21.181193 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:54:21 crc kubenswrapper[4757]: I1216 13:54:21.181264 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:54:21 crc kubenswrapper[4757]: I1216 13:54:21.846589 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerStarted","Data":"2384b0305eb5a24ea286f66872cee3399e20bddf9a20dcd792a77a2f3c2a64bf"} Dec 16 13:54:22 crc kubenswrapper[4757]: I1216 13:54:22.861433 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerStarted","Data":"dfaee799d5c0df85b868b9360eaf8d7ef5e864d0dfee5a110eb902ef080dec8a"} Dec 16 13:54:24 crc kubenswrapper[4757]: I1216 13:54:24.880976 4757 generic.go:334] "Generic (PLEG): container finished" podID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerID="dfaee799d5c0df85b868b9360eaf8d7ef5e864d0dfee5a110eb902ef080dec8a" exitCode=0 Dec 16 13:54:24 crc kubenswrapper[4757]: I1216 13:54:24.881051 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerDied","Data":"dfaee799d5c0df85b868b9360eaf8d7ef5e864d0dfee5a110eb902ef080dec8a"} Dec 16 13:54:25 crc kubenswrapper[4757]: I1216 13:54:25.894460 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerStarted","Data":"9a3cfe61b28c8ae7e96c00d00f9606643abf3f3b08138db757128e8bc4447347"} Dec 16 13:54:25 crc kubenswrapper[4757]: I1216 13:54:25.923844 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z89sv" podStartSLOduration=3.401208601 podStartE2EDuration="7.923823454s" podCreationTimestamp="2025-12-16 13:54:18 +0000 UTC" firstStartedPulling="2025-12-16 13:54:20.829901352 +0000 UTC m=+4046.257645148" lastFinishedPulling="2025-12-16 13:54:25.352516205 +0000 UTC m=+4050.780260001" observedRunningTime="2025-12-16 13:54:25.920977584 +0000 UTC m=+4051.348721380" watchObservedRunningTime="2025-12-16 13:54:25.923823454 +0000 UTC m=+4051.351567260" Dec 16 13:54:26 crc kubenswrapper[4757]: I1216 13:54:26.905793 4757 generic.go:334] "Generic (PLEG): container finished" 
podID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerID="2384b0305eb5a24ea286f66872cee3399e20bddf9a20dcd792a77a2f3c2a64bf" exitCode=0 Dec 16 13:54:26 crc kubenswrapper[4757]: I1216 13:54:26.905977 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerDied","Data":"2384b0305eb5a24ea286f66872cee3399e20bddf9a20dcd792a77a2f3c2a64bf"} Dec 16 13:54:27 crc kubenswrapper[4757]: I1216 13:54:27.923654 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerStarted","Data":"8fa2db6fafe3d2bcf5ffa1ad547fca4a2b0785bcf9defb675a255c17da88750a"} Dec 16 13:54:28 crc kubenswrapper[4757]: I1216 13:54:28.952985 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bk524" podStartSLOduration=4.437434915 podStartE2EDuration="11.952968622s" podCreationTimestamp="2025-12-16 13:54:17 +0000 UTC" firstStartedPulling="2025-12-16 13:54:20.000216495 +0000 UTC m=+4045.427960311" lastFinishedPulling="2025-12-16 13:54:27.515750222 +0000 UTC m=+4052.943494018" observedRunningTime="2025-12-16 13:54:28.947269813 +0000 UTC m=+4054.375013609" watchObservedRunningTime="2025-12-16 13:54:28.952968622 +0000 UTC m=+4054.380712418" Dec 16 13:54:29 crc kubenswrapper[4757]: I1216 13:54:29.000757 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:29 crc kubenswrapper[4757]: I1216 13:54:29.001835 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:30 crc kubenswrapper[4757]: I1216 13:54:30.055569 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-z89sv" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="registry-server" probeResult="failure" output=< Dec 16 13:54:30 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s Dec 16 13:54:30 crc kubenswrapper[4757]: > Dec 16 13:54:38 crc kubenswrapper[4757]: I1216 13:54:38.325379 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:38 crc kubenswrapper[4757]: I1216 13:54:38.326713 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:39 crc kubenswrapper[4757]: I1216 13:54:39.046667 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:39 crc kubenswrapper[4757]: I1216 13:54:39.094296 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:39 crc kubenswrapper[4757]: I1216 13:54:39.284308 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z89sv"] Dec 16 13:54:39 crc kubenswrapper[4757]: I1216 13:54:39.835474 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bk524" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="registry-server" probeResult="failure" output=< Dec 16 13:54:39 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s Dec 16 13:54:39 crc kubenswrapper[4757]: > Dec 16 
13:54:40 crc kubenswrapper[4757]: I1216 13:54:40.068518 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z89sv" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="registry-server" containerID="cri-o://9a3cfe61b28c8ae7e96c00d00f9606643abf3f3b08138db757128e8bc4447347" gracePeriod=2 Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.092343 4757 generic.go:334] "Generic (PLEG): container finished" podID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerID="9a3cfe61b28c8ae7e96c00d00f9606643abf3f3b08138db757128e8bc4447347" exitCode=0 Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.092924 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerDied","Data":"9a3cfe61b28c8ae7e96c00d00f9606643abf3f3b08138db757128e8bc4447347"} Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.306428 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.429482 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-catalog-content\") pod \"bc70b314-f05d-40d2-ac58-d73b18bde652\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.429711 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-utilities\") pod \"bc70b314-f05d-40d2-ac58-d73b18bde652\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.429815 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvg4f\" (UniqueName: \"kubernetes.io/projected/bc70b314-f05d-40d2-ac58-d73b18bde652-kube-api-access-fvg4f\") pod \"bc70b314-f05d-40d2-ac58-d73b18bde652\" (UID: \"bc70b314-f05d-40d2-ac58-d73b18bde652\") " Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.430427 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-utilities" (OuterVolumeSpecName: "utilities") pod "bc70b314-f05d-40d2-ac58-d73b18bde652" (UID: "bc70b314-f05d-40d2-ac58-d73b18bde652"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.431223 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.448732 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc70b314-f05d-40d2-ac58-d73b18bde652-kube-api-access-fvg4f" (OuterVolumeSpecName: "kube-api-access-fvg4f") pod "bc70b314-f05d-40d2-ac58-d73b18bde652" (UID: "bc70b314-f05d-40d2-ac58-d73b18bde652"). InnerVolumeSpecName "kube-api-access-fvg4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.465234 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc70b314-f05d-40d2-ac58-d73b18bde652" (UID: "bc70b314-f05d-40d2-ac58-d73b18bde652"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.534580 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc70b314-f05d-40d2-ac58-d73b18bde652-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:54:41 crc kubenswrapper[4757]: I1216 13:54:41.534651 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvg4f\" (UniqueName: \"kubernetes.io/projected/bc70b314-f05d-40d2-ac58-d73b18bde652-kube-api-access-fvg4f\") on node \"crc\" DevicePath \"\"" Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.103642 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z89sv" event={"ID":"bc70b314-f05d-40d2-ac58-d73b18bde652","Type":"ContainerDied","Data":"af3776a9e7b2e7ced9e25daf3a2089b52ca8453f0791defeae162aed707b7201"} Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.103986 4757 scope.go:117] "RemoveContainer" containerID="9a3cfe61b28c8ae7e96c00d00f9606643abf3f3b08138db757128e8bc4447347" Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.103741 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z89sv" Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.134716 4757 scope.go:117] "RemoveContainer" containerID="dfaee799d5c0df85b868b9360eaf8d7ef5e864d0dfee5a110eb902ef080dec8a" Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.145661 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z89sv"] Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.155839 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z89sv"] Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.163076 4757 scope.go:117] "RemoveContainer" containerID="6cac8fb7b77faea4e3a9731d6ff34a92df5c0662f3f73d234c24f6de2b5c4abe" Dec 16 13:54:42 crc kubenswrapper[4757]: I1216 13:54:42.961047 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" path="/var/lib/kubelet/pods/bc70b314-f05d-40d2-ac58-d73b18bde652/volumes" Dec 16 13:54:48 crc kubenswrapper[4757]: I1216 13:54:48.364502 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:48 crc kubenswrapper[4757]: I1216 13:54:48.431999 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:49 crc kubenswrapper[4757]: I1216 13:54:49.798190 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bk524"] Dec 16 13:54:50 crc kubenswrapper[4757]: I1216 13:54:50.183168 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bk524" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="registry-server" 
containerID="cri-o://8fa2db6fafe3d2bcf5ffa1ad547fca4a2b0785bcf9defb675a255c17da88750a" gracePeriod=2 Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.181363 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.181703 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.181753 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.182589 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc2b581c5d760a9f331945725f63ec478a166eec486ac29ae73884d9107add8d"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.182641 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://fc2b581c5d760a9f331945725f63ec478a166eec486ac29ae73884d9107add8d" gracePeriod=600 Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.193704 4757 generic.go:334] "Generic (PLEG): container finished" podID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerID="8fa2db6fafe3d2bcf5ffa1ad547fca4a2b0785bcf9defb675a255c17da88750a" exitCode=0 Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.193744 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerDied","Data":"8fa2db6fafe3d2bcf5ffa1ad547fca4a2b0785bcf9defb675a255c17da88750a"} Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.591389 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.738191 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvdsb\" (UniqueName: \"kubernetes.io/projected/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-kube-api-access-cvdsb\") pod \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.738644 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-catalog-content\") pod \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.738827 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-utilities\") pod \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\" (UID: \"8f0e3573-30b8-4ad5-b802-16ae7e1dd027\") " Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.740593 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-utilities" (OuterVolumeSpecName: "utilities") pod "8f0e3573-30b8-4ad5-b802-16ae7e1dd027" (UID: "8f0e3573-30b8-4ad5-b802-16ae7e1dd027"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.758404 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-kube-api-access-cvdsb" (OuterVolumeSpecName: "kube-api-access-cvdsb") pod "8f0e3573-30b8-4ad5-b802-16ae7e1dd027" (UID: "8f0e3573-30b8-4ad5-b802-16ae7e1dd027"). InnerVolumeSpecName "kube-api-access-cvdsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.842317 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.842359 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvdsb\" (UniqueName: \"kubernetes.io/projected/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-kube-api-access-cvdsb\") on node \"crc\" DevicePath \"\"" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.888674 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f0e3573-30b8-4ad5-b802-16ae7e1dd027" (UID: "8f0e3573-30b8-4ad5-b802-16ae7e1dd027"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:54:51 crc kubenswrapper[4757]: I1216 13:54:51.944322 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f0e3573-30b8-4ad5-b802-16ae7e1dd027-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.206884 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="fc2b581c5d760a9f331945725f63ec478a166eec486ac29ae73884d9107add8d" exitCode=0 Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.206971 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"fc2b581c5d760a9f331945725f63ec478a166eec486ac29ae73884d9107add8d"} Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.207533 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"} Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.207564 4757 scope.go:117] "RemoveContainer" containerID="e49c5ade6bf9decd15734ce6103a23323f8cfd9fa63895c9165fe510077c792a" Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.214419 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bk524" event={"ID":"8f0e3573-30b8-4ad5-b802-16ae7e1dd027","Type":"ContainerDied","Data":"be57067bfc230eb38f19deed7b35d056ada2805b1705471f19a81ca792217c47"} Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.214517 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bk524" Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.257881 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bk524"] Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.266981 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bk524"] Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.364961 4757 scope.go:117] "RemoveContainer" containerID="8fa2db6fafe3d2bcf5ffa1ad547fca4a2b0785bcf9defb675a255c17da88750a" Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.388484 4757 scope.go:117] "RemoveContainer" containerID="2384b0305eb5a24ea286f66872cee3399e20bddf9a20dcd792a77a2f3c2a64bf" Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.415338 4757 scope.go:117] "RemoveContainer" containerID="20519d695f2a451fbecab521f5e0d68c9d3762c2ee65b63de0bfc43009c1a32c" Dec 16 13:54:52 crc kubenswrapper[4757]: I1216 13:54:52.959494 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" path="/var/lib/kubelet/pods/8f0e3573-30b8-4ad5-b802-16ae7e1dd027/volumes" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.951412 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-phdrp"] Dec 16 13:56:35 crc kubenswrapper[4757]: E1216 13:56:35.952429 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="extract-utilities" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.952812 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="extract-utilities" Dec 16 13:56:35 crc kubenswrapper[4757]: E1216 13:56:35.952827 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="extract-content" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.952833 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="extract-content" Dec 16 13:56:35 crc kubenswrapper[4757]: E1216 13:56:35.953371 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="registry-server" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.953381 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="registry-server" Dec 16 13:56:35 crc kubenswrapper[4757]: E1216 13:56:35.953398 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="extract-utilities" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.953404 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="extract-utilities" Dec 16 13:56:35 crc kubenswrapper[4757]: E1216 13:56:35.953422 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="registry-server" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.953428 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="registry-server" Dec 16 13:56:35 crc kubenswrapper[4757]: E1216 13:56:35.953442 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" 
containerName="extract-content" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.953447 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="extract-content" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.953858 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc70b314-f05d-40d2-ac58-d73b18bde652" containerName="registry-server" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.953886 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0e3573-30b8-4ad5-b802-16ae7e1dd027" containerName="registry-server" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.956170 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:35 crc kubenswrapper[4757]: I1216 13:56:35.982129 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phdrp"] Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.043112 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-utilities\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.043315 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbbf\" (UniqueName: \"kubernetes.io/projected/6e615d85-41f7-4d69-acfb-1e0596913032-kube-api-access-pzbbf\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.043398 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-catalog-content\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.144829 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbbf\" (UniqueName: \"kubernetes.io/projected/6e615d85-41f7-4d69-acfb-1e0596913032-kube-api-access-pzbbf\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.144909 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-catalog-content\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.144928 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-utilities\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.145420 4757 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-catalog-content\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.145466 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-utilities\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.168375 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbbf\" (UniqueName: \"kubernetes.io/projected/6e615d85-41f7-4d69-acfb-1e0596913032-kube-api-access-pzbbf\") pod \"community-operators-phdrp\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.281351 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:36 crc kubenswrapper[4757]: I1216 13:56:36.890542 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phdrp"] Dec 16 13:56:37 crc kubenswrapper[4757]: I1216 13:56:37.207393 4757 generic.go:334] "Generic (PLEG): container finished" podID="6e615d85-41f7-4d69-acfb-1e0596913032" containerID="c8e544459fac71d5022b4a9d75e1544318527a9216a7259a9ce71b4411e75ef4" exitCode=0 Dec 16 13:56:37 crc kubenswrapper[4757]: I1216 13:56:37.208427 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerDied","Data":"c8e544459fac71d5022b4a9d75e1544318527a9216a7259a9ce71b4411e75ef4"} Dec 16 13:56:37 crc kubenswrapper[4757]: I1216 13:56:37.208980 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerStarted","Data":"f8042d68110f03e218e18a073621dbe3effe5eaa615daa61320ab71016a816fa"} Dec 16 13:56:38 crc kubenswrapper[4757]: I1216 13:56:38.223852 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerStarted","Data":"38c347f6bff9ecc2c3dc003f4e48fb80f9fcc4772e77f1085b2837a8b4b51c87"} Dec 16 13:56:39 crc kubenswrapper[4757]: I1216 13:56:39.235043 4757 generic.go:334] "Generic (PLEG): container finished" podID="6e615d85-41f7-4d69-acfb-1e0596913032" containerID="38c347f6bff9ecc2c3dc003f4e48fb80f9fcc4772e77f1085b2837a8b4b51c87" exitCode=0 Dec 16 13:56:39 crc kubenswrapper[4757]: I1216 13:56:39.235113 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerDied","Data":"38c347f6bff9ecc2c3dc003f4e48fb80f9fcc4772e77f1085b2837a8b4b51c87"} Dec 16 13:56:40 crc kubenswrapper[4757]: I1216 13:56:40.247731 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerStarted","Data":"3b312ac04fbc7775e4db6453ac1717289c482cef3d7932eb4ebf00a819b0cab1"} Dec 16 
13:56:40 crc kubenswrapper[4757]: I1216 13:56:40.267238 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-phdrp" podStartSLOduration=2.814583948 podStartE2EDuration="5.267221131s" podCreationTimestamp="2025-12-16 13:56:35 +0000 UTC" firstStartedPulling="2025-12-16 13:56:37.209143156 +0000 UTC m=+4182.636886952" lastFinishedPulling="2025-12-16 13:56:39.661780339 +0000 UTC m=+4185.089524135" observedRunningTime="2025-12-16 13:56:40.264582708 +0000 UTC m=+4185.692326514" watchObservedRunningTime="2025-12-16 13:56:40.267221131 +0000 UTC m=+4185.694964927" Dec 16 13:56:46 crc kubenswrapper[4757]: I1216 13:56:46.282795 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:46 crc kubenswrapper[4757]: I1216 13:56:46.284846 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:46 crc kubenswrapper[4757]: I1216 13:56:46.345032 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:47 crc kubenswrapper[4757]: I1216 13:56:47.362321 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:47 crc kubenswrapper[4757]: I1216 13:56:47.417651 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phdrp"] Dec 16 13:56:49 crc kubenswrapper[4757]: I1216 13:56:49.341230 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-phdrp" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="registry-server" containerID="cri-o://3b312ac04fbc7775e4db6453ac1717289c482cef3d7932eb4ebf00a819b0cab1" gracePeriod=2 Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.363665 4757 generic.go:334] "Generic (PLEG): container finished" podID="6e615d85-41f7-4d69-acfb-1e0596913032" containerID="3b312ac04fbc7775e4db6453ac1717289c482cef3d7932eb4ebf00a819b0cab1" exitCode=0 Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.363762 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerDied","Data":"3b312ac04fbc7775e4db6453ac1717289c482cef3d7932eb4ebf00a819b0cab1"} Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.498513 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.655590 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-utilities\") pod \"6e615d85-41f7-4d69-acfb-1e0596913032\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.655780 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-catalog-content\") pod \"6e615d85-41f7-4d69-acfb-1e0596913032\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.655872 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzbbf\" (UniqueName: \"kubernetes.io/projected/6e615d85-41f7-4d69-acfb-1e0596913032-kube-api-access-pzbbf\") pod \"6e615d85-41f7-4d69-acfb-1e0596913032\" (UID: \"6e615d85-41f7-4d69-acfb-1e0596913032\") " Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.656773 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-utilities" (OuterVolumeSpecName: "utilities") pod "6e615d85-41f7-4d69-acfb-1e0596913032" (UID: "6e615d85-41f7-4d69-acfb-1e0596913032"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.681238 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e615d85-41f7-4d69-acfb-1e0596913032-kube-api-access-pzbbf" (OuterVolumeSpecName: "kube-api-access-pzbbf") pod "6e615d85-41f7-4d69-acfb-1e0596913032" (UID: "6e615d85-41f7-4d69-acfb-1e0596913032"). InnerVolumeSpecName "kube-api-access-pzbbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.725113 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e615d85-41f7-4d69-acfb-1e0596913032" (UID: "6e615d85-41f7-4d69-acfb-1e0596913032"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.758945 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.758998 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e615d85-41f7-4d69-acfb-1e0596913032-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 13:56:50 crc kubenswrapper[4757]: I1216 13:56:50.759041 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzbbf\" (UniqueName: \"kubernetes.io/projected/6e615d85-41f7-4d69-acfb-1e0596913032-kube-api-access-pzbbf\") on node \"crc\" DevicePath \"\"" Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.181755 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.182086 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.376248 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phdrp" event={"ID":"6e615d85-41f7-4d69-acfb-1e0596913032","Type":"ContainerDied","Data":"f8042d68110f03e218e18a073621dbe3effe5eaa615daa61320ab71016a816fa"} Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.376302 4757 scope.go:117] "RemoveContainer" containerID="3b312ac04fbc7775e4db6453ac1717289c482cef3d7932eb4ebf00a819b0cab1" Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.376384 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-phdrp" Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.405731 4757 scope.go:117] "RemoveContainer" containerID="38c347f6bff9ecc2c3dc003f4e48fb80f9fcc4772e77f1085b2837a8b4b51c87" Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.406488 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phdrp"] Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.415695 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-phdrp"] Dec 16 13:56:51 crc kubenswrapper[4757]: I1216 13:56:51.429522 4757 scope.go:117] "RemoveContainer" containerID="c8e544459fac71d5022b4a9d75e1544318527a9216a7259a9ce71b4411e75ef4" Dec 16 13:56:52 crc kubenswrapper[4757]: I1216 13:56:52.961961 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" path="/var/lib/kubelet/pods/6e615d85-41f7-4d69-acfb-1e0596913032/volumes" Dec 16 13:57:21 crc kubenswrapper[4757]: I1216 13:57:21.180895 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:57:21 crc kubenswrapper[4757]: I1216 13:57:21.181506 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.181490 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.182789 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.182897 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.183720 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.183851 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" 
gracePeriod=600 Dec 16 13:57:51 crc kubenswrapper[4757]: E1216 13:57:51.303580 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.922745 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" exitCode=0 Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.922793 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"} Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.922831 4757 scope.go:117] "RemoveContainer" containerID="fc2b581c5d760a9f331945725f63ec478a166eec486ac29ae73884d9107add8d" Dec 16 13:57:51 crc kubenswrapper[4757]: I1216 13:57:51.923565 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:57:51 crc kubenswrapper[4757]: E1216 13:57:51.923985 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:58:06 crc kubenswrapper[4757]: I1216 13:58:06.948803 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:58:06 crc kubenswrapper[4757]: E1216 13:58:06.949639 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:58:20 crc kubenswrapper[4757]: I1216 13:58:20.949777 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:58:20 crc kubenswrapper[4757]: E1216 13:58:20.952197 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:58:35 crc kubenswrapper[4757]: I1216 13:58:35.949125 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:58:35 crc kubenswrapper[4757]: E1216 13:58:35.949932 4757 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:58:47 crc kubenswrapper[4757]: I1216 13:58:47.949047 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:58:47 crc kubenswrapper[4757]: E1216 13:58:47.949894 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:59:02 crc kubenswrapper[4757]: I1216 13:59:02.949234 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:59:02 crc kubenswrapper[4757]: E1216 13:59:02.950316 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:59:14 crc kubenswrapper[4757]: I1216 13:59:14.956178 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:59:14 crc kubenswrapper[4757]: E1216 13:59:14.957105 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:59:29 crc kubenswrapper[4757]: I1216 13:59:29.949670 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:59:29 crc kubenswrapper[4757]: E1216 13:59:29.950587 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:59:43 crc kubenswrapper[4757]: I1216 13:59:43.950406 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:59:43 crc kubenswrapper[4757]: E1216 13:59:43.951139 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 13:59:55 crc kubenswrapper[4757]: I1216 13:59:55.948544 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 13:59:55 crc kubenswrapper[4757]: E1216 13:59:55.949340 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.165655 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c"] Dec 16 14:00:00 crc kubenswrapper[4757]: E1216 14:00:00.166798 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="registry-server" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.166819 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="registry-server" Dec 16 14:00:00 crc kubenswrapper[4757]: E1216 14:00:00.166841 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="extract-utilities" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.166851 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="extract-utilities" Dec 16 14:00:00 crc kubenswrapper[4757]: E1216 14:00:00.166877 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="extract-content" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.166887 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="extract-content" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.167194 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e615d85-41f7-4d69-acfb-1e0596913032" containerName="registry-server" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.168143 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.170836 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.171100 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.185226 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c"] Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.205207 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wwgq\" (UniqueName: \"kubernetes.io/projected/db917622-f561-4d30-a4b9-bc45252d7400-kube-api-access-6wwgq\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.205249 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db917622-f561-4d30-a4b9-bc45252d7400-config-volume\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.205656 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db917622-f561-4d30-a4b9-bc45252d7400-secret-volume\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.308198 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wwgq\" (UniqueName: \"kubernetes.io/projected/db917622-f561-4d30-a4b9-bc45252d7400-kube-api-access-6wwgq\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.308256 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db917622-f561-4d30-a4b9-bc45252d7400-config-volume\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.308309 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db917622-f561-4d30-a4b9-bc45252d7400-secret-volume\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.309317 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db917622-f561-4d30-a4b9-bc45252d7400-config-volume\") pod 
\"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.315812 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db917622-f561-4d30-a4b9-bc45252d7400-secret-volume\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.333383 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wwgq\" (UniqueName: \"kubernetes.io/projected/db917622-f561-4d30-a4b9-bc45252d7400-kube-api-access-6wwgq\") pod \"collect-profiles-29431560-hsm6c\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.506162 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:00 crc kubenswrapper[4757]: I1216 14:00:00.965612 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c"] Dec 16 14:00:01 crc kubenswrapper[4757]: I1216 14:00:01.047158 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" event={"ID":"db917622-f561-4d30-a4b9-bc45252d7400","Type":"ContainerStarted","Data":"48af3b8688f54151f334023f549a5c765d6c53f480c18749cc4dd8900bce3cb5"} Dec 16 14:00:02 crc kubenswrapper[4757]: I1216 14:00:02.058663 4757 generic.go:334] "Generic (PLEG): container finished" podID="db917622-f561-4d30-a4b9-bc45252d7400" containerID="627c0df5157ac6c48da989780f04b45ebc05bbdba0572e49c0ba223de81cfba0" exitCode=0 Dec 16 14:00:02 crc kubenswrapper[4757]: I1216 14:00:02.058968 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" event={"ID":"db917622-f561-4d30-a4b9-bc45252d7400","Type":"ContainerDied","Data":"627c0df5157ac6c48da989780f04b45ebc05bbdba0572e49c0ba223de81cfba0"} Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.467877 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.566828 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wwgq\" (UniqueName: \"kubernetes.io/projected/db917622-f561-4d30-a4b9-bc45252d7400-kube-api-access-6wwgq\") pod \"db917622-f561-4d30-a4b9-bc45252d7400\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.567075 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db917622-f561-4d30-a4b9-bc45252d7400-secret-volume\") pod \"db917622-f561-4d30-a4b9-bc45252d7400\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.567100 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db917622-f561-4d30-a4b9-bc45252d7400-config-volume\") pod \"db917622-f561-4d30-a4b9-bc45252d7400\" (UID: \"db917622-f561-4d30-a4b9-bc45252d7400\") " Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.568149 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db917622-f561-4d30-a4b9-bc45252d7400-config-volume" (OuterVolumeSpecName: "config-volume") pod "db917622-f561-4d30-a4b9-bc45252d7400" (UID: "db917622-f561-4d30-a4b9-bc45252d7400"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.668866 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db917622-f561-4d30-a4b9-bc45252d7400-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:00:03 crc kubenswrapper[4757]: I1216 14:00:03.996035 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db917622-f561-4d30-a4b9-bc45252d7400-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "db917622-f561-4d30-a4b9-bc45252d7400" (UID: "db917622-f561-4d30-a4b9-bc45252d7400"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.000331 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db917622-f561-4d30-a4b9-bc45252d7400-kube-api-access-6wwgq" (OuterVolumeSpecName: "kube-api-access-6wwgq") pod "db917622-f561-4d30-a4b9-bc45252d7400" (UID: "db917622-f561-4d30-a4b9-bc45252d7400"). InnerVolumeSpecName "kube-api-access-6wwgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.074293 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" event={"ID":"db917622-f561-4d30-a4b9-bc45252d7400","Type":"ContainerDied","Data":"48af3b8688f54151f334023f549a5c765d6c53f480c18749cc4dd8900bce3cb5"} Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.074332 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48af3b8688f54151f334023f549a5c765d6c53f480c18749cc4dd8900bce3cb5" Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.074395 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431560-hsm6c" Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.078466 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wwgq\" (UniqueName: \"kubernetes.io/projected/db917622-f561-4d30-a4b9-bc45252d7400-kube-api-access-6wwgq\") on node \"crc\" DevicePath \"\"" Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.078493 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db917622-f561-4d30-a4b9-bc45252d7400-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.549466 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc"] Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.558150 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431515-hw5rc"] Dec 16 14:00:04 crc kubenswrapper[4757]: I1216 14:00:04.963250 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692db28f-b3cf-43b3-8822-fc5898543119" path="/var/lib/kubelet/pods/692db28f-b3cf-43b3-8822-fc5898543119/volumes" Dec 16 14:00:08 crc kubenswrapper[4757]: I1216 14:00:08.948761 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:00:08 crc kubenswrapper[4757]: E1216 14:00:08.949398 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:00:21 crc kubenswrapper[4757]: I1216 14:00:21.949260 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:00:21 crc kubenswrapper[4757]: E1216 14:00:21.950283 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:00:31 crc kubenswrapper[4757]: I1216 14:00:31.979359 4757 scope.go:117] "RemoveContainer" containerID="039e694a3cb005f960d1484fa244d8eff9e777fdc7c40fcef9e53d7076d20204" Dec 16 14:00:36 crc kubenswrapper[4757]: I1216 14:00:36.949607 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:00:36 crc kubenswrapper[4757]: E1216 14:00:36.950358 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:00:48 crc kubenswrapper[4757]: I1216 14:00:48.948928 4757 
scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:00:48 crc kubenswrapper[4757]: E1216 14:00:48.949885 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.168354 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29431561-kfrzf"] Dec 16 14:01:00 crc kubenswrapper[4757]: E1216 14:01:00.169554 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db917622-f561-4d30-a4b9-bc45252d7400" containerName="collect-profiles" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.169571 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="db917622-f561-4d30-a4b9-bc45252d7400" containerName="collect-profiles" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.169759 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="db917622-f561-4d30-a4b9-bc45252d7400" containerName="collect-profiles" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.170401 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.178516 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431561-kfrzf"] Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.366121 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66cwr\" (UniqueName: \"kubernetes.io/projected/82cd88c0-672b-4d50-ae86-edeae2da08a1-kube-api-access-66cwr\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.366512 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-combined-ca-bundle\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.366693 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-config-data\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.367113 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-fernet-keys\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.470665 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-fernet-keys\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.470769 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66cwr\" (UniqueName: \"kubernetes.io/projected/82cd88c0-672b-4d50-ae86-edeae2da08a1-kube-api-access-66cwr\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.470865 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-combined-ca-bundle\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.470915 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-config-data\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.480833 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-combined-ca-bundle\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.481166 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-fernet-keys\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.492999 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-config-data\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.498807 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66cwr\" (UniqueName: \"kubernetes.io/projected/82cd88c0-672b-4d50-ae86-edeae2da08a1-kube-api-access-66cwr\") pod \"keystone-cron-29431561-kfrzf\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.796825 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:00 crc kubenswrapper[4757]: I1216 14:01:00.950109 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:01:00 crc kubenswrapper[4757]: E1216 14:01:00.950583 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:01:01 crc kubenswrapper[4757]: I1216 14:01:01.291559 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431561-kfrzf"] Dec 16 14:01:01 crc kubenswrapper[4757]: I1216 14:01:01.666781 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431561-kfrzf" event={"ID":"82cd88c0-672b-4d50-ae86-edeae2da08a1","Type":"ContainerStarted","Data":"a61fec5b7482bc7ee376261b571c849ff031659ceece8050ec06c439f1e3d884"} Dec 16 14:01:01 crc kubenswrapper[4757]: I1216 14:01:01.666825 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431561-kfrzf" event={"ID":"82cd88c0-672b-4d50-ae86-edeae2da08a1","Type":"ContainerStarted","Data":"22f6ce911b991685fa54c9bb356381d22f26de8e781cf8f06ed31180852cbeda"} Dec 16 14:01:01 crc kubenswrapper[4757]: I1216 14:01:01.692746 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29431561-kfrzf" podStartSLOduration=1.692722221 podStartE2EDuration="1.692722221s" podCreationTimestamp="2025-12-16 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:01:01.689499253 +0000 UTC m=+4447.117243049" watchObservedRunningTime="2025-12-16 14:01:01.692722221 +0000 UTC m=+4447.120466027" Dec 16 14:01:05 crc kubenswrapper[4757]: I1216 14:01:05.698485 4757 generic.go:334] "Generic (PLEG): container finished" podID="82cd88c0-672b-4d50-ae86-edeae2da08a1" containerID="a61fec5b7482bc7ee376261b571c849ff031659ceece8050ec06c439f1e3d884" exitCode=0 Dec 16 14:01:05 crc kubenswrapper[4757]: I1216 14:01:05.698556 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431561-kfrzf" event={"ID":"82cd88c0-672b-4d50-ae86-edeae2da08a1","Type":"ContainerDied","Data":"a61fec5b7482bc7ee376261b571c849ff031659ceece8050ec06c439f1e3d884"} Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.393457 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.502388 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-combined-ca-bundle\") pod \"82cd88c0-672b-4d50-ae86-edeae2da08a1\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.502446 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66cwr\" (UniqueName: \"kubernetes.io/projected/82cd88c0-672b-4d50-ae86-edeae2da08a1-kube-api-access-66cwr\") pod \"82cd88c0-672b-4d50-ae86-edeae2da08a1\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.502547 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-fernet-keys\") pod \"82cd88c0-672b-4d50-ae86-edeae2da08a1\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.502629 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-config-data\") pod \"82cd88c0-672b-4d50-ae86-edeae2da08a1\" (UID: \"82cd88c0-672b-4d50-ae86-edeae2da08a1\") " Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.508381 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "82cd88c0-672b-4d50-ae86-edeae2da08a1" (UID: "82cd88c0-672b-4d50-ae86-edeae2da08a1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.515410 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82cd88c0-672b-4d50-ae86-edeae2da08a1-kube-api-access-66cwr" (OuterVolumeSpecName: "kube-api-access-66cwr") pod "82cd88c0-672b-4d50-ae86-edeae2da08a1" (UID: "82cd88c0-672b-4d50-ae86-edeae2da08a1"). InnerVolumeSpecName "kube-api-access-66cwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.536197 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82cd88c0-672b-4d50-ae86-edeae2da08a1" (UID: "82cd88c0-672b-4d50-ae86-edeae2da08a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.578265 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-config-data" (OuterVolumeSpecName: "config-data") pod "82cd88c0-672b-4d50-ae86-edeae2da08a1" (UID: "82cd88c0-672b-4d50-ae86-edeae2da08a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.605307 4757 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.605358 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.605368 4757 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cd88c0-672b-4d50-ae86-edeae2da08a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.605379 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66cwr\" (UniqueName: \"kubernetes.io/projected/82cd88c0-672b-4d50-ae86-edeae2da08a1-kube-api-access-66cwr\") on node \"crc\" DevicePath \"\"" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.714821 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431561-kfrzf" event={"ID":"82cd88c0-672b-4d50-ae86-edeae2da08a1","Type":"ContainerDied","Data":"22f6ce911b991685fa54c9bb356381d22f26de8e781cf8f06ed31180852cbeda"} Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.715039 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f6ce911b991685fa54c9bb356381d22f26de8e781cf8f06ed31180852cbeda" Dec 16 14:01:07 crc kubenswrapper[4757]: I1216 14:01:07.715147 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29431561-kfrzf" Dec 16 14:01:15 crc kubenswrapper[4757]: I1216 14:01:15.949358 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:01:15 crc kubenswrapper[4757]: E1216 14:01:15.950150 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:01:28 crc kubenswrapper[4757]: I1216 14:01:28.949398 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:01:28 crc kubenswrapper[4757]: E1216 14:01:28.950311 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:01:42 crc kubenswrapper[4757]: I1216 14:01:42.949323 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1" Dec 16 14:01:42 crc kubenswrapper[4757]: E1216 14:01:42.950473 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:01:55 crc kubenswrapper[4757]: I1216 14:01:55.949067 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:01:55 crc kubenswrapper[4757]: E1216 14:01:55.950070 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:02:07 crc kubenswrapper[4757]: I1216 14:02:07.949660 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:02:07 crc kubenswrapper[4757]: E1216 14:02:07.950281 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:02:20 crc kubenswrapper[4757]: I1216 14:02:20.948850 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:02:20 crc kubenswrapper[4757]: E1216 14:02:20.949563 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:02:33 crc kubenswrapper[4757]: I1216 14:02:33.949181 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:02:33 crc kubenswrapper[4757]: E1216 14:02:33.949904 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:02:45 crc kubenswrapper[4757]: I1216 14:02:45.949322 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:02:45 crc kubenswrapper[4757]: E1216 14:02:45.950239 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:02:57 crc kubenswrapper[4757]: I1216 14:02:57.948684 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:02:58 crc kubenswrapper[4757]: I1216 14:02:58.736898 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"c91d48e57b3f8247554192d083c76d7b456728814ff6a8c6a586d7bb61f94fa1"}
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.013063 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-548qx"]
Dec 16 14:03:01 crc kubenswrapper[4757]: E1216 14:03:01.014234 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82cd88c0-672b-4d50-ae86-edeae2da08a1" containerName="keystone-cron"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.014250 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cd88c0-672b-4d50-ae86-edeae2da08a1" containerName="keystone-cron"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.014464 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="82cd88c0-672b-4d50-ae86-edeae2da08a1" containerName="keystone-cron"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.016117 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.027488 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-548qx"]
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.168717 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbk7\" (UniqueName: \"kubernetes.io/projected/34c0c5e7-afa0-4ef6-87bc-615917721f64-kube-api-access-gtbk7\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.168781 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-utilities\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.168805 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-catalog-content\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.270401 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbk7\" (UniqueName: \"kubernetes.io/projected/34c0c5e7-afa0-4ef6-87bc-615917721f64-kube-api-access-gtbk7\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.270838 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-utilities\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.271356 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-utilities\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.270873 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-catalog-content\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.271389 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-catalog-content\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.296078 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbk7\" (UniqueName: \"kubernetes.io/projected/34c0c5e7-afa0-4ef6-87bc-615917721f64-kube-api-access-gtbk7\") pod \"certified-operators-548qx\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") " pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.334552 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:01 crc kubenswrapper[4757]: I1216 14:03:01.862806 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-548qx"]
Dec 16 14:03:02 crc kubenswrapper[4757]: I1216 14:03:02.784175 4757 generic.go:334] "Generic (PLEG): container finished" podID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerID="964cece4e8d734c187333e00db6a40ed174b001f6b5e78a8a5b42a8514cd6aeb" exitCode=0
Dec 16 14:03:02 crc kubenswrapper[4757]: I1216 14:03:02.784401 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-548qx" event={"ID":"34c0c5e7-afa0-4ef6-87bc-615917721f64","Type":"ContainerDied","Data":"964cece4e8d734c187333e00db6a40ed174b001f6b5e78a8a5b42a8514cd6aeb"}
Dec 16 14:03:02 crc kubenswrapper[4757]: I1216 14:03:02.785888 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-548qx" event={"ID":"34c0c5e7-afa0-4ef6-87bc-615917721f64","Type":"ContainerStarted","Data":"552fbd11ded7b899edf95675ced07bbf6080ff95ac00552697e89f44801e40db"}
Dec 16 14:03:02 crc kubenswrapper[4757]: I1216 14:03:02.787229 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 16 14:03:04 crc kubenswrapper[4757]: I1216 14:03:04.811537 4757 generic.go:334] "Generic (PLEG): container finished" podID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerID="5a622512f0ebb4ec1bd6414c6fcae192de82dfc2fc19ba44ea97dda903185c37" exitCode=0
Dec 16 14:03:04 crc kubenswrapper[4757]: I1216 14:03:04.812373 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-548qx" event={"ID":"34c0c5e7-afa0-4ef6-87bc-615917721f64","Type":"ContainerDied","Data":"5a622512f0ebb4ec1bd6414c6fcae192de82dfc2fc19ba44ea97dda903185c37"}
Dec 16 14:03:05 crc kubenswrapper[4757]: I1216 14:03:05.821467 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-548qx" event={"ID":"34c0c5e7-afa0-4ef6-87bc-615917721f64","Type":"ContainerStarted","Data":"29d751e63c0d448053958de7adc87d31b79365f42f3c8ff2587db119a72b7014"}
Dec 16 14:03:05 crc kubenswrapper[4757]: I1216 14:03:05.841770 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-548qx" podStartSLOduration=3.194312798 podStartE2EDuration="5.841752768s" podCreationTimestamp="2025-12-16 14:03:00 +0000 UTC" firstStartedPulling="2025-12-16 14:03:02.786930922 +0000 UTC m=+4568.214674718" lastFinishedPulling="2025-12-16 14:03:05.434370892 +0000 UTC m=+4570.862114688" observedRunningTime="2025-12-16 14:03:05.838641633 +0000 UTC m=+4571.266385459" watchObservedRunningTime="2025-12-16 14:03:05.841752768 +0000 UTC m=+4571.269496564"
Dec 16 14:03:11 crc kubenswrapper[4757]: I1216 14:03:11.335591 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:11 crc kubenswrapper[4757]: I1216 14:03:11.336094 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:11 crc kubenswrapper[4757]: I1216 14:03:11.382510 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:11 crc kubenswrapper[4757]: I1216 14:03:11.921677 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:11 crc kubenswrapper[4757]: I1216 14:03:11.977345 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-548qx"]
Dec 16 14:03:13 crc kubenswrapper[4757]: I1216 14:03:13.895641 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-548qx" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="registry-server" containerID="cri-o://29d751e63c0d448053958de7adc87d31b79365f42f3c8ff2587db119a72b7014" gracePeriod=2
Dec 16 14:03:14 crc kubenswrapper[4757]: I1216 14:03:14.907376 4757 generic.go:334] "Generic (PLEG): container finished" podID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerID="29d751e63c0d448053958de7adc87d31b79365f42f3c8ff2587db119a72b7014" exitCode=0
Dec 16 14:03:14 crc kubenswrapper[4757]: I1216 14:03:14.907415 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-548qx" event={"ID":"34c0c5e7-afa0-4ef6-87bc-615917721f64","Type":"ContainerDied","Data":"29d751e63c0d448053958de7adc87d31b79365f42f3c8ff2587db119a72b7014"}
Dec 16 14:03:14 crc kubenswrapper[4757]: I1216 14:03:14.907713 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-548qx" event={"ID":"34c0c5e7-afa0-4ef6-87bc-615917721f64","Type":"ContainerDied","Data":"552fbd11ded7b899edf95675ced07bbf6080ff95ac00552697e89f44801e40db"}
Dec 16 14:03:14 crc kubenswrapper[4757]: I1216 14:03:14.907730 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552fbd11ded7b899edf95675ced07bbf6080ff95ac00552697e89f44801e40db"
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.077491 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.149819 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-utilities\") pod \"34c0c5e7-afa0-4ef6-87bc-615917721f64\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") "
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.149936 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-catalog-content\") pod \"34c0c5e7-afa0-4ef6-87bc-615917721f64\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") "
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.149972 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtbk7\" (UniqueName: \"kubernetes.io/projected/34c0c5e7-afa0-4ef6-87bc-615917721f64-kube-api-access-gtbk7\") pod \"34c0c5e7-afa0-4ef6-87bc-615917721f64\" (UID: \"34c0c5e7-afa0-4ef6-87bc-615917721f64\") "
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.150576 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-utilities" (OuterVolumeSpecName: "utilities") pod "34c0c5e7-afa0-4ef6-87bc-615917721f64" (UID: "34c0c5e7-afa0-4ef6-87bc-615917721f64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.156850 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c0c5e7-afa0-4ef6-87bc-615917721f64-kube-api-access-gtbk7" (OuterVolumeSpecName: "kube-api-access-gtbk7") pod "34c0c5e7-afa0-4ef6-87bc-615917721f64" (UID: "34c0c5e7-afa0-4ef6-87bc-615917721f64"). InnerVolumeSpecName "kube-api-access-gtbk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.204101 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34c0c5e7-afa0-4ef6-87bc-615917721f64" (UID: "34c0c5e7-afa0-4ef6-87bc-615917721f64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.251551 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.251584 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtbk7\" (UniqueName: \"kubernetes.io/projected/34c0c5e7-afa0-4ef6-87bc-615917721f64-kube-api-access-gtbk7\") on node \"crc\" DevicePath \"\""
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.251595 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c0c5e7-afa0-4ef6-87bc-615917721f64-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.914834 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-548qx"
Dec 16 14:03:15 crc kubenswrapper[4757]: I1216 14:03:15.995129 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-548qx"]
Dec 16 14:03:16 crc kubenswrapper[4757]: I1216 14:03:16.003161 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-548qx"]
Dec 16 14:03:16 crc kubenswrapper[4757]: I1216 14:03:16.961342 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" path="/var/lib/kubelet/pods/34c0c5e7-afa0-4ef6-87bc-615917721f64/volumes"
Dec 16 14:05:21 crc kubenswrapper[4757]: I1216 14:05:21.181228 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 14:05:21 crc kubenswrapper[4757]: I1216 14:05:21.181850 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 14:05:51 crc kubenswrapper[4757]: I1216 14:05:51.180936 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 14:05:51 crc kubenswrapper[4757]: I1216 14:05:51.181621 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.317078 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7mrgw"]
Dec 16 14:06:12 crc kubenswrapper[4757]: E1216 14:06:12.318112 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="extract-content"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.318131 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="extract-content"
Dec 16 14:06:12 crc kubenswrapper[4757]: E1216 14:06:12.318148 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="registry-server"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.318158 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="registry-server"
Dec 16 14:06:12 crc kubenswrapper[4757]: E1216 14:06:12.318187 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="extract-utilities"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.318196 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="extract-utilities"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.318507 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c0c5e7-afa0-4ef6-87bc-615917721f64" containerName="registry-server"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.320150 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.333069 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mrgw"]
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.371509 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbg5\" (UniqueName: \"kubernetes.io/projected/a3661e80-939f-4c44-81bc-10cb1fa010c6-kube-api-access-gcbg5\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.371663 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-utilities\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.371708 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-catalog-content\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.473786 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-utilities\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.473852 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-catalog-content\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.473957 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbg5\" (UniqueName: \"kubernetes.io/projected/a3661e80-939f-4c44-81bc-10cb1fa010c6-kube-api-access-gcbg5\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.474530 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-catalog-content\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.474757 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-utilities\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.497036 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbg5\" (UniqueName: \"kubernetes.io/projected/a3661e80-939f-4c44-81bc-10cb1fa010c6-kube-api-access-gcbg5\") pod \"redhat-operators-7mrgw\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") " pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:12 crc kubenswrapper[4757]: I1216 14:06:12.647478 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:13 crc kubenswrapper[4757]: I1216 14:06:13.615089 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mrgw"]
Dec 16 14:06:14 crc kubenswrapper[4757]: E1216 14:06:14.068877 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3661e80_939f_4c44_81bc_10cb1fa010c6.slice/crio-conmon-548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c.scope\": RecentStats: unable to find data in memory cache]"
Dec 16 14:06:14 crc kubenswrapper[4757]: I1216 14:06:14.454642 4757 generic.go:334] "Generic (PLEG): container finished" podID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerID="548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c" exitCode=0
Dec 16 14:06:14 crc kubenswrapper[4757]: I1216 14:06:14.454685 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerDied","Data":"548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c"}
Dec 16 14:06:14 crc kubenswrapper[4757]: I1216 14:06:14.454709 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerStarted","Data":"8bf83366b1f7cd69142f9424e93fbf69b8297dd0644016da73b9325839cf2974"}
Dec 16 14:06:16 crc kubenswrapper[4757]: I1216 14:06:16.479040 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerStarted","Data":"71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e"}
Dec 16 14:06:20 crc kubenswrapper[4757]: I1216 14:06:20.520447 4757 generic.go:334] "Generic (PLEG): container finished" podID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerID="71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e" exitCode=0
Dec 16 14:06:20 crc kubenswrapper[4757]: I1216 14:06:20.520489 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerDied","Data":"71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e"}
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.181935 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.182046 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.182099 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt"
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.182906 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c91d48e57b3f8247554192d083c76d7b456728814ff6a8c6a586d7bb61f94fa1"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.182981 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://c91d48e57b3f8247554192d083c76d7b456728814ff6a8c6a586d7bb61f94fa1" gracePeriod=600
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.532731 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerStarted","Data":"48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9"}
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.536577 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="c91d48e57b3f8247554192d083c76d7b456728814ff6a8c6a586d7bb61f94fa1" exitCode=0
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.536612 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"c91d48e57b3f8247554192d083c76d7b456728814ff6a8c6a586d7bb61f94fa1"}
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.536633 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba"}
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.536648 4757 scope.go:117] "RemoveContainer" containerID="4883ad7f9c16cad4eb8535b172f916e18544c811b5409aa8e2de3e221a56a2f1"
Dec 16 14:06:21 crc kubenswrapper[4757]: I1216 14:06:21.565481 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7mrgw" podStartSLOduration=2.812410226 podStartE2EDuration="9.565454194s" podCreationTimestamp="2025-12-16 14:06:12 +0000 UTC" firstStartedPulling="2025-12-16 14:06:14.456422574 +0000 UTC m=+4759.884166370" lastFinishedPulling="2025-12-16 14:06:21.209466542 +0000 UTC m=+4766.637210338" observedRunningTime="2025-12-16 14:06:21.562575304 +0000 UTC m=+4766.990319100" watchObservedRunningTime="2025-12-16 14:06:21.565454194 +0000 UTC m=+4766.993197990"
Dec 16 14:06:22 crc kubenswrapper[4757]: I1216 14:06:22.647841 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:22 crc kubenswrapper[4757]: I1216 14:06:22.648552 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:23 crc kubenswrapper[4757]: I1216 14:06:23.693044 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7mrgw" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="registry-server" probeResult="failure" output=<
Dec 16 14:06:23 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s
Dec 16 14:06:23 crc kubenswrapper[4757]: >
Dec 16 14:06:27 crc kubenswrapper[4757]: I1216 14:06:27.785037 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4bmzc"]
Dec 16 14:06:27 crc kubenswrapper[4757]: I1216 14:06:27.787417 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:27 crc kubenswrapper[4757]: I1216 14:06:27.802159 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bmzc"]
Dec 16 14:06:27 crc kubenswrapper[4757]: I1216 14:06:27.899255 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-catalog-content\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:27 crc kubenswrapper[4757]: I1216 14:06:27.899302 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-utilities\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:27 crc kubenswrapper[4757]: I1216 14:06:27.899366 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8fv6\" (UniqueName: \"kubernetes.io/projected/8c3ada82-6905-403c-ab19-78623d1d0c93-kube-api-access-v8fv6\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.000771 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8fv6\" (UniqueName: \"kubernetes.io/projected/8c3ada82-6905-403c-ab19-78623d1d0c93-kube-api-access-v8fv6\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.001168 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-catalog-content\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.001262 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-utilities\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.001696 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-catalog-content\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.001730 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-utilities\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.027418 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8fv6\" (UniqueName: \"kubernetes.io/projected/8c3ada82-6905-403c-ab19-78623d1d0c93-kube-api-access-v8fv6\") pod \"redhat-marketplace-4bmzc\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") " pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.121382 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:28 crc kubenswrapper[4757]: I1216 14:06:28.662036 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bmzc"]
Dec 16 14:06:29 crc kubenswrapper[4757]: I1216 14:06:29.604884 4757 generic.go:334] "Generic (PLEG): container finished" podID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerID="dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676" exitCode=0
Dec 16 14:06:29 crc kubenswrapper[4757]: I1216 14:06:29.604935 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerDied","Data":"dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676"}
Dec 16 14:06:29 crc kubenswrapper[4757]: I1216 14:06:29.605352 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerStarted","Data":"9697d64843296a18623ca98715e1aa0d1523a17bd7de948df171324538623c12"}
Dec 16 14:06:31 crc kubenswrapper[4757]: I1216 14:06:31.626922 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerStarted","Data":"36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc"}
Dec 16 14:06:32 crc kubenswrapper[4757]: I1216 14:06:32.637634 4757 generic.go:334] "Generic (PLEG): container finished" podID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerID="36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc" exitCode=0
Dec 16 14:06:32 crc kubenswrapper[4757]: I1216 14:06:32.638026 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerDied","Data":"36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc"}
Dec 16 14:06:33 crc kubenswrapper[4757]: I1216 14:06:33.700256 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7mrgw" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="registry-server" probeResult="failure" output=<
Dec 16 14:06:33 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s
Dec 16 14:06:33 crc kubenswrapper[4757]: >
Dec 16 14:06:34 crc kubenswrapper[4757]: I1216 14:06:34.658810 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerStarted","Data":"21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3"}
Dec 16 14:06:34 crc kubenswrapper[4757]: I1216 14:06:34.695172 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4bmzc" podStartSLOduration=3.813716507 podStartE2EDuration="7.695149224s" podCreationTimestamp="2025-12-16 14:06:27 +0000 UTC" firstStartedPulling="2025-12-16 14:06:29.60686407 +0000 UTC m=+4775.034607866" lastFinishedPulling="2025-12-16 14:06:33.488296777 +0000 UTC m=+4778.916040583" observedRunningTime="2025-12-16 14:06:34.686355179 +0000 UTC m=+4780.114098975" watchObservedRunningTime="2025-12-16 14:06:34.695149224 +0000 UTC m=+4780.122893020"
Dec 16 14:06:38 crc kubenswrapper[4757]: I1216 14:06:38.122514 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:38 crc kubenswrapper[4757]: I1216 14:06:38.123146 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:38 crc kubenswrapper[4757]: I1216 14:06:38.260782 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:38 crc kubenswrapper[4757]: I1216 14:06:38.764527 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:38 crc kubenswrapper[4757]: I1216 14:06:38.844508 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bmzc"]
Dec 16 14:06:40 crc kubenswrapper[4757]: I1216 14:06:40.706715 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4bmzc" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="registry-server" containerID="cri-o://21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3" gracePeriod=2
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.307630 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.449497 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-catalog-content\") pod \"8c3ada82-6905-403c-ab19-78623d1d0c93\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") "
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.449624 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-utilities\") pod \"8c3ada82-6905-403c-ab19-78623d1d0c93\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") "
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.449752 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8fv6\" (UniqueName: \"kubernetes.io/projected/8c3ada82-6905-403c-ab19-78623d1d0c93-kube-api-access-v8fv6\") pod \"8c3ada82-6905-403c-ab19-78623d1d0c93\" (UID: \"8c3ada82-6905-403c-ab19-78623d1d0c93\") "
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.451076 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-utilities" (OuterVolumeSpecName: "utilities") pod "8c3ada82-6905-403c-ab19-78623d1d0c93" (UID: "8c3ada82-6905-403c-ab19-78623d1d0c93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.462941 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c3ada82-6905-403c-ab19-78623d1d0c93-kube-api-access-v8fv6" (OuterVolumeSpecName: "kube-api-access-v8fv6") pod "8c3ada82-6905-403c-ab19-78623d1d0c93" (UID: "8c3ada82-6905-403c-ab19-78623d1d0c93"). InnerVolumeSpecName "kube-api-access-v8fv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.472352 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c3ada82-6905-403c-ab19-78623d1d0c93" (UID: "8c3ada82-6905-403c-ab19-78623d1d0c93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.552603 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.552847 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c3ada82-6905-403c-ab19-78623d1d0c93-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.552880 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8fv6\" (UniqueName: \"kubernetes.io/projected/8c3ada82-6905-403c-ab19-78623d1d0c93-kube-api-access-v8fv6\") on node \"crc\" DevicePath \"\""
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.717981 4757 generic.go:334] "Generic (PLEG): container finished" podID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerID="21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3" exitCode=0
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.718048 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerDied","Data":"21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3"}
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.718068 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bmzc"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.718093 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bmzc" event={"ID":"8c3ada82-6905-403c-ab19-78623d1d0c93","Type":"ContainerDied","Data":"9697d64843296a18623ca98715e1aa0d1523a17bd7de948df171324538623c12"}
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.718114 4757 scope.go:117] "RemoveContainer" containerID="21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.738451 4757 scope.go:117] "RemoveContainer" containerID="36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.760035 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bmzc"]
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.771326 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bmzc"]
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.779968 4757 scope.go:117] "RemoveContainer" containerID="dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.819996 4757 scope.go:117] "RemoveContainer" containerID="21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3"
Dec 16 14:06:41 crc kubenswrapper[4757]: E1216 14:06:41.820685 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3\": container with ID starting with 21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3 not found: ID does not exist" containerID="21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.820760 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3"} err="failed to get container status \"21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3\": rpc error: code = NotFound desc = could not find container \"21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3\": container with ID starting with 21465c88779150096a55f73659ca5ca835a7744e8a82b53ffde97557394d9eb3 not found: ID does not exist"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.820794 4757 scope.go:117] "RemoveContainer" containerID="36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc"
Dec 16 14:06:41 crc kubenswrapper[4757]: E1216 14:06:41.821177 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc\": container with ID starting with 36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc not found: ID does not exist" containerID="36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.821209 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc"} err="failed to get container status \"36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc\": rpc error: code = NotFound desc = could not find container \"36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc\": container with ID starting with 36574483502e095449c0e801acbc1402e34c5a4ebc6dbe1e8b56c170285e33dc not found: ID does not exist"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.821230 4757 scope.go:117] "RemoveContainer" containerID="dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676"
Dec 16 14:06:41 crc kubenswrapper[4757]: E1216 14:06:41.821516 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676\": container with ID starting with dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676 not found: ID does not exist" containerID="dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676"
Dec 16 14:06:41 crc kubenswrapper[4757]: I1216 14:06:41.821545 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676"} err="failed to get container status \"dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676\": rpc error: code = NotFound desc = could not find container \"dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676\": container with ID starting with dedbbd812b9fac63db63fd1cd47576a311c024e9f0a68899361eb7ab0b4fb676 not found: ID does not exist"
Dec 16 14:06:42 crc kubenswrapper[4757]: I1216 14:06:42.701151 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:42 crc kubenswrapper[4757]: I1216 14:06:42.757975 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:42 crc kubenswrapper[4757]: I1216 14:06:42.960707 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" path="/var/lib/kubelet/pods/8c3ada82-6905-403c-ab19-78623d1d0c93/volumes"
Dec 16 14:06:43 crc kubenswrapper[4757]: I1216 14:06:43.898459 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mrgw"]
Dec 16 14:06:43 crc kubenswrapper[4757]: I1216 14:06:43.899026 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7mrgw" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="registry-server" containerID="cri-o://48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9" gracePeriod=2
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.454455 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.605529 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-catalog-content\") pod \"a3661e80-939f-4c44-81bc-10cb1fa010c6\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") "
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.605639 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-utilities\") pod \"a3661e80-939f-4c44-81bc-10cb1fa010c6\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") "
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.605738 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcbg5\" (UniqueName: \"kubernetes.io/projected/a3661e80-939f-4c44-81bc-10cb1fa010c6-kube-api-access-gcbg5\") pod \"a3661e80-939f-4c44-81bc-10cb1fa010c6\" (UID: \"a3661e80-939f-4c44-81bc-10cb1fa010c6\") "
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.606428 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-utilities" (OuterVolumeSpecName: "utilities") pod "a3661e80-939f-4c44-81bc-10cb1fa010c6" (UID: "a3661e80-939f-4c44-81bc-10cb1fa010c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.612834 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3661e80-939f-4c44-81bc-10cb1fa010c6-kube-api-access-gcbg5" (OuterVolumeSpecName: "kube-api-access-gcbg5") pod "a3661e80-939f-4c44-81bc-10cb1fa010c6" (UID: "a3661e80-939f-4c44-81bc-10cb1fa010c6"). InnerVolumeSpecName "kube-api-access-gcbg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.707868 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-utilities\") on node \"crc\" DevicePath \"\""
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.707906 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcbg5\" (UniqueName: \"kubernetes.io/projected/a3661e80-939f-4c44-81bc-10cb1fa010c6-kube-api-access-gcbg5\") on node \"crc\" DevicePath \"\""
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.732553 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3661e80-939f-4c44-81bc-10cb1fa010c6" (UID: "a3661e80-939f-4c44-81bc-10cb1fa010c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.758583 4757 generic.go:334] "Generic (PLEG): container finished" podID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerID="48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9" exitCode=0
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.758647 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mrgw"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.758677 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerDied","Data":"48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9"}
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.759053 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mrgw" event={"ID":"a3661e80-939f-4c44-81bc-10cb1fa010c6","Type":"ContainerDied","Data":"8bf83366b1f7cd69142f9424e93fbf69b8297dd0644016da73b9325839cf2974"}
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.759072 4757 scope.go:117] "RemoveContainer" containerID="48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.799823 4757 scope.go:117] "RemoveContainer" containerID="71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.816453 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3661e80-939f-4c44-81bc-10cb1fa010c6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.821453 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mrgw"]
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.846888 4757 scope.go:117] "RemoveContainer" containerID="548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.850606 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7mrgw"]
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.882504 4757 scope.go:117] "RemoveContainer" containerID="48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9"
Dec 16 14:06:44 crc kubenswrapper[4757]: E1216 14:06:44.883380 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9\": container with ID starting with 48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9 not found: ID does not exist" containerID="48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.883472 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9"} err="failed to get container status \"48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9\": rpc error: code = NotFound desc = could not find container \"48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9\": container with ID starting with 48867fe4b6b71412f244291b73f4cb6c4a26a1dbb75f1cb34cf5563b516b0ae9 not found: ID does not exist"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.883494 4757 scope.go:117] "RemoveContainer" containerID="71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e"
Dec 16 14:06:44 crc kubenswrapper[4757]: E1216 14:06:44.883842 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e\": container with ID starting with 71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e not found: ID does not exist" containerID="71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.883866 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e"} err="failed to get container status \"71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e\": rpc error: code = NotFound desc = could not find container \"71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e\": container with ID starting with 71b2c2a36120dfeff26a6932cf1ae167a496f5e6ec5390ff9c7b3395e0ab8f9e not found: ID does not exist"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.883881 4757 scope.go:117] "RemoveContainer" containerID="548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c"
Dec 16 14:06:44 crc kubenswrapper[4757]: E1216 14:06:44.884101 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c\": container with ID starting with 548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c not found: ID does not exist" containerID="548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.884117 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c"} err="failed to get container status \"548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c\": rpc error: code = NotFound desc = could not find container \"548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c\": container with ID starting with 548de378cd1209251e321020d217bb1117e1c34fa8b7f3c3f1e5db547f5c8b7c not found: ID does not exist"
Dec 16 14:06:44 crc kubenswrapper[4757]: E1216 14:06:44.924602 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3661e80_939f_4c44_81bc_10cb1fa010c6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3661e80_939f_4c44_81bc_10cb1fa010c6.slice/crio-8bf83366b1f7cd69142f9424e93fbf69b8297dd0644016da73b9325839cf2974\": RecentStats: unable to find data in memory cache]"
Dec 16 14:06:44 crc kubenswrapper[4757]: I1216 14:06:44.961663 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" path="/var/lib/kubelet/pods/a3661e80-939f-4c44-81bc-10cb1fa010c6/volumes"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109055 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9w8nh"]
Dec 16 14:06:47 crc kubenswrapper[4757]: E1216 14:06:47.109660 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="extract-utilities"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109675 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="extract-utilities"
Dec 16 14:06:47 crc kubenswrapper[4757]: E1216 14:06:47.109703 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="extract-content"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109709 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="extract-content"
Dec 16 14:06:47 crc kubenswrapper[4757]: E1216 14:06:47.109724 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="registry-server"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109731 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="registry-server"
Dec 16 14:06:47 crc kubenswrapper[4757]: E1216 14:06:47.109740 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="extract-utilities"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109746 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="extract-utilities"
Dec 16 14:06:47 crc kubenswrapper[4757]: E1216 14:06:47.109756 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="registry-server"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109762 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="registry-server"
Dec 16 14:06:47 crc kubenswrapper[4757]: E1216 14:06:47.109774 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="extract-content"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109780 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="extract-content"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109959 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3661e80-939f-4c44-81bc-10cb1fa010c6" containerName="registry-server"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.109973 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3ada82-6905-403c-ab19-78623d1d0c93" containerName="registry-server"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.111239 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.130248 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w8nh"]
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.261069 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-utilities\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.261151 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-catalog-content\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.261439 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzg9b\" (UniqueName: \"kubernetes.io/projected/04e2ce19-e49c-4f1d-9415-b4cf61167069-kube-api-access-hzg9b\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.362149 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzg9b\" (UniqueName: \"kubernetes.io/projected/04e2ce19-e49c-4f1d-9415-b4cf61167069-kube-api-access-hzg9b\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.362249 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-utilities\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.362276 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-catalog-content\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.363086 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-catalog-content\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.363207 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-utilities\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.393696 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzg9b\" (UniqueName: \"kubernetes.io/projected/04e2ce19-e49c-4f1d-9415-b4cf61167069-kube-api-access-hzg9b\") pod \"community-operators-9w8nh\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:47 crc kubenswrapper[4757]: I1216 14:06:47.439804 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8nh"
Dec 16 14:06:48 crc kubenswrapper[4757]: I1216 14:06:48.114097 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w8nh"]
Dec 16 14:06:48 crc kubenswrapper[4757]: I1216 14:06:48.820402 4757 generic.go:334] "Generic (PLEG): container finished" podID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerID="d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9" exitCode=0
Dec 16 14:06:48 crc kubenswrapper[4757]: I1216 14:06:48.820463 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8nh" event={"ID":"04e2ce19-e49c-4f1d-9415-b4cf61167069","Type":"ContainerDied","Data":"d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9"}
Dec 16 14:06:48 crc kubenswrapper[4757]: I1216 14:06:48.821035 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8nh" event={"ID":"04e2ce19-e49c-4f1d-9415-b4cf61167069","Type":"ContainerStarted","Data":"c83b5cb5ea8a86b374fcbd2424cf4ae1dfbd8d23459f2b617298072f5b5919bf"}
Dec 16 14:06:50 crc kubenswrapper[4757]: I1216 14:06:50.841043 4757 generic.go:334] "Generic (PLEG): container finished" podID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerID="c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8" exitCode=0
Dec 16 14:06:50 crc kubenswrapper[4757]: I1216 14:06:50.841156 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8nh" event={"ID":"04e2ce19-e49c-4f1d-9415-b4cf61167069","Type":"ContainerDied","Data":"c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8"}
Dec 16 14:06:51 crc kubenswrapper[4757]: I1216 14:06:51.854294 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8nh" event={"ID":"04e2ce19-e49c-4f1d-9415-b4cf61167069","Type":"ContainerStarted","Data":"4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492"}
Dec 16 14:06:51 crc kubenswrapper[4757]: I1216 14:06:51.885028 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w8nh" podStartSLOduration=2.3911864290000002 podStartE2EDuration="4.884994486s" podCreationTimestamp="2025-12-16 14:06:47 +0000 UTC" firstStartedPulling="2025-12-16 14:06:48.822402641 +0000 UTC m=+4794.250146437" lastFinishedPulling="2025-12-16 14:06:51.316210698 +0000 UTC m=+4796.743954494" observedRunningTime="2025-12-16 14:06:51.879522882 +0000 UTC m=+4797.307266678" watchObservedRunningTime="2025-12-16 14:06:51.884994486 +0000 UTC m=+4797.312738282"
Dec 16 14:06:57 crc kubenswrapper[4757]: I1216 14:06:57.441590 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="unhealthy" pod="openshift-marketplace/community-operators-9w8nh" Dec 16 14:06:57 crc kubenswrapper[4757]: I1216 14:06:57.442246 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w8nh" Dec 16 14:06:57 crc kubenswrapper[4757]: I1216 14:06:57.496550 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9w8nh" Dec 16 14:06:57 crc kubenswrapper[4757]: I1216 14:06:57.967345 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w8nh" Dec 16 14:06:58 crc kubenswrapper[4757]: I1216 14:06:58.748194 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w8nh"] Dec 16 14:06:59 crc kubenswrapper[4757]: I1216 14:06:59.937744 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9w8nh" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="registry-server" containerID="cri-o://4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492" gracePeriod=2 Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.534509 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8nh" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.665440 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-utilities\") pod \"04e2ce19-e49c-4f1d-9415-b4cf61167069\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.665751 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-catalog-content\") pod \"04e2ce19-e49c-4f1d-9415-b4cf61167069\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.665972 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzg9b\" (UniqueName: \"kubernetes.io/projected/04e2ce19-e49c-4f1d-9415-b4cf61167069-kube-api-access-hzg9b\") pod \"04e2ce19-e49c-4f1d-9415-b4cf61167069\" (UID: \"04e2ce19-e49c-4f1d-9415-b4cf61167069\") " Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.672942 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e2ce19-e49c-4f1d-9415-b4cf61167069-kube-api-access-hzg9b" (OuterVolumeSpecName: "kube-api-access-hzg9b") pod "04e2ce19-e49c-4f1d-9415-b4cf61167069" (UID: "04e2ce19-e49c-4f1d-9415-b4cf61167069"). InnerVolumeSpecName "kube-api-access-hzg9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.687658 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-utilities" (OuterVolumeSpecName: "utilities") pod "04e2ce19-e49c-4f1d-9415-b4cf61167069" (UID: "04e2ce19-e49c-4f1d-9415-b4cf61167069"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.749212 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04e2ce19-e49c-4f1d-9415-b4cf61167069" (UID: "04e2ce19-e49c-4f1d-9415-b4cf61167069"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.767555 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzg9b\" (UniqueName: \"kubernetes.io/projected/04e2ce19-e49c-4f1d-9415-b4cf61167069-kube-api-access-hzg9b\") on node \"crc\" DevicePath \"\"" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.767584 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.767595 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04e2ce19-e49c-4f1d-9415-b4cf61167069-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.953927 4757 generic.go:334] "Generic (PLEG): container finished" podID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerID="4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492" exitCode=0 Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.954032 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8nh" Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.963126 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8nh" event={"ID":"04e2ce19-e49c-4f1d-9415-b4cf61167069","Type":"ContainerDied","Data":"4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492"} Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.963174 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8nh" event={"ID":"04e2ce19-e49c-4f1d-9415-b4cf61167069","Type":"ContainerDied","Data":"c83b5cb5ea8a86b374fcbd2424cf4ae1dfbd8d23459f2b617298072f5b5919bf"} Dec 16 14:07:00 crc kubenswrapper[4757]: I1216 14:07:00.963197 4757 scope.go:117] "RemoveContainer" containerID="4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.003571 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w8nh"] Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.016487 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9w8nh"] Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.022592 4757 scope.go:117] "RemoveContainer" containerID="c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.086194 4757 scope.go:117] "RemoveContainer" containerID="d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.130199 4757 scope.go:117] "RemoveContainer" containerID="4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492" Dec 16 14:07:01 crc kubenswrapper[4757]: E1216 14:07:01.133265 4757 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492\": container with ID starting with 4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492 not found: ID does not exist" containerID="4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.133318 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492"} err="failed to get container status \"4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492\": rpc error: code = NotFound desc = could not find container \"4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492\": container with ID starting with 4fec659f52690fc3dee8a57c4a2eddffcae6ba92538f9c526b80cb234292d492 not found: ID does not exist" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.133351 4757 scope.go:117] "RemoveContainer" containerID="c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8" Dec 16 14:07:01 crc kubenswrapper[4757]: E1216 14:07:01.135166 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8\": container with ID starting with c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8 not found: ID does not exist" containerID="c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.135204 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8"} err="failed to get container status \"c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8\": rpc error: code = NotFound desc = could not find container \"c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8\": container with ID starting with c0a8543d1d27d1dd2bb81ffc426b3aa6b6740aa2eef9ca379a11e1315e6f42c8 not found: ID does not exist" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.135229 4757 scope.go:117] "RemoveContainer" containerID="d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9" Dec 16 14:07:01 crc kubenswrapper[4757]: E1216 14:07:01.137310 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9\": container with ID starting with d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9 not found: ID does not exist" containerID="d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9" Dec 16 14:07:01 crc kubenswrapper[4757]: I1216 14:07:01.137347 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9"} err="failed to get container status \"d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9\": rpc error: code = NotFound desc = could not find container \"d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9\": container with ID starting with d6f00e560ea53b55669514e6e6f0deeab60deb3a00db578ab5045b1328b4b0a9 not found: ID does not exist" Dec 16 14:07:02 crc kubenswrapper[4757]: I1216 14:07:02.963720 4757 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" path="/var/lib/kubelet/pods/04e2ce19-e49c-4f1d-9415-b4cf61167069/volumes" Dec 16 14:08:21 crc kubenswrapper[4757]: I1216 14:08:21.181151 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:08:21 crc kubenswrapper[4757]: I1216 14:08:21.181777 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:08:28 crc kubenswrapper[4757]: I1216 14:08:28.744509 4757 generic.go:334] "Generic (PLEG): container finished" podID="d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" containerID="8dd50f318ea809b86ce4b54c8f97e070e5be854c7486dfe17a7d66471b8752df" exitCode=0 Dec 16 14:08:28 crc kubenswrapper[4757]: I1216 14:08:28.744632 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8","Type":"ContainerDied","Data":"8dd50f318ea809b86ce4b54c8f97e070e5be854c7486dfe17a7d66471b8752df"} Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.106243 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195397 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ca-certs\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195505 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-workdir\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195563 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-temporary\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195708 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195766 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config-secret\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195822 4757 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ssh-key\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195844 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-config-data\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195910 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jsjc\" (UniqueName: \"kubernetes.io/projected/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-kube-api-access-7jsjc\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.195967 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config\") pod \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\" (UID: \"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8\") " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.196455 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.198607 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-config-data" (OuterVolumeSpecName: "config-data") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.202596 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.207272 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-kube-api-access-7jsjc" (OuterVolumeSpecName: "kube-api-access-7jsjc") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "kube-api-access-7jsjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.208210 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.224962 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.233538 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.255409 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.261424 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" (UID: "d2802d44-5cd2-4f45-80b0-d423d3ab6ea8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299116 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299151 4757 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299163 4757 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299178 4757 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299751 4757 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299773 4757 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299787 4757 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299797 4757 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.299807 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jsjc\" (UniqueName: \"kubernetes.io/projected/d2802d44-5cd2-4f45-80b0-d423d3ab6ea8-kube-api-access-7jsjc\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.320088 4757 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.402505 4757 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.775561 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d2802d44-5cd2-4f45-80b0-d423d3ab6ea8","Type":"ContainerDied","Data":"35bff818b8e3b66d15c746f24a2968c478639494de5fdf3121527e5b27a4c836"} Dec 16 14:08:30 crc kubenswrapper[4757]: I1216 14:08:30.775599 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bff818b8e3b66d15c746f24a2968c478639494de5fdf3121527e5b27a4c836" Dec 16 14:08:30 crc 
kubenswrapper[4757]: I1216 14:08:30.776240 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.887944 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 14:08:39 crc kubenswrapper[4757]: E1216 14:08:39.889344 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="extract-utilities" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.889364 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="extract-utilities" Dec 16 14:08:39 crc kubenswrapper[4757]: E1216 14:08:39.889378 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="registry-server" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.889387 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="registry-server" Dec 16 14:08:39 crc kubenswrapper[4757]: E1216 14:08:39.889402 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" containerName="tempest-tests-tempest-tests-runner" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.889411 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" containerName="tempest-tests-tempest-tests-runner" Dec 16 14:08:39 crc kubenswrapper[4757]: E1216 14:08:39.889444 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="extract-content" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.889451 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="extract-content" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.889684 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e2ce19-e49c-4f1d-9415-b4cf61167069" containerName="registry-server" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.889744 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2802d44-5cd2-4f45-80b0-d423d3ab6ea8" containerName="tempest-tests-tempest-tests-runner" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.890569 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.911113 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 14:08:39 crc kubenswrapper[4757]: I1216 14:08:39.926986 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5zkmx" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.010699 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.010787 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgpnw\" (UniqueName: \"kubernetes.io/projected/20af979c-4f0b-44d5-946c-fa6138ee9539-kube-api-access-jgpnw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.119332 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgpnw\" (UniqueName: \"kubernetes.io/projected/20af979c-4f0b-44d5-946c-fa6138ee9539-kube-api-access-jgpnw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.119842 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.121584 4757 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.142852 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgpnw\" (UniqueName: \"kubernetes.io/projected/20af979c-4f0b-44d5-946c-fa6138ee9539-kube-api-access-jgpnw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.150658 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"20af979c-4f0b-44d5-946c-fa6138ee9539\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc 
kubenswrapper[4757]: I1216 14:08:40.259573 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.685382 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.689336 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 14:08:40 crc kubenswrapper[4757]: I1216 14:08:40.876133 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"20af979c-4f0b-44d5-946c-fa6138ee9539","Type":"ContainerStarted","Data":"6b24ab2b1c24a5aa59d7a1df8f9926f0fc296d12a9111f63b922d93739b2ca94"} Dec 16 14:08:41 crc kubenswrapper[4757]: I1216 14:08:41.889539 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"20af979c-4f0b-44d5-946c-fa6138ee9539","Type":"ContainerStarted","Data":"e61c0be0e53fe858c995403a9616882ecbf2704c69d3ccd9b7834519f591e52a"} Dec 16 14:08:41 crc kubenswrapper[4757]: I1216 14:08:41.914276 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.903530803 podStartE2EDuration="2.91425477s" podCreationTimestamp="2025-12-16 14:08:39 +0000 UTC" firstStartedPulling="2025-12-16 14:08:40.689066597 +0000 UTC m=+4906.116810393" lastFinishedPulling="2025-12-16 14:08:41.699790564 +0000 UTC m=+4907.127534360" observedRunningTime="2025-12-16 14:08:41.903647531 +0000 UTC m=+4907.331391347" watchObservedRunningTime="2025-12-16 14:08:41.91425477 +0000 UTC m=+4907.341998576" Dec 16 14:08:51 crc kubenswrapper[4757]: I1216 14:08:51.181891 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:08:51 crc kubenswrapper[4757]: I1216 14:08:51.182503 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.537635 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfbgg/must-gather-pw4zt"] Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.539561 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.541640 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gfbgg"/"openshift-service-ca.crt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.541694 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gfbgg"/"kube-root-ca.crt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.541812 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gfbgg"/"default-dockercfg-jddkr" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.563401 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gfbgg/must-gather-pw4zt"] Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.641431 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgdq\" (UniqueName: \"kubernetes.io/projected/495e66e8-8d8a-43f8-a754-623e9cb354f5-kube-api-access-tpgdq\") pod \"must-gather-pw4zt\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.641604 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/495e66e8-8d8a-43f8-a754-623e9cb354f5-must-gather-output\") pod \"must-gather-pw4zt\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.743123 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgdq\" (UniqueName: \"kubernetes.io/projected/495e66e8-8d8a-43f8-a754-623e9cb354f5-kube-api-access-tpgdq\") pod \"must-gather-pw4zt\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.743232 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/495e66e8-8d8a-43f8-a754-623e9cb354f5-must-gather-output\") pod \"must-gather-pw4zt\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.743746 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/495e66e8-8d8a-43f8-a754-623e9cb354f5-must-gather-output\") pod \"must-gather-pw4zt\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.772683 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgdq\" (UniqueName: \"kubernetes.io/projected/495e66e8-8d8a-43f8-a754-623e9cb354f5-kube-api-access-tpgdq\") pod \"must-gather-pw4zt\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:05 crc kubenswrapper[4757]: I1216 14:09:05.863334 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:09:06 crc kubenswrapper[4757]: I1216 14:09:06.548568 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gfbgg/must-gather-pw4zt"] Dec 16 14:09:07 crc kubenswrapper[4757]: I1216 14:09:07.154606 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" event={"ID":"495e66e8-8d8a-43f8-a754-623e9cb354f5","Type":"ContainerStarted","Data":"0577a52e496597899624d547c344260a6f0301b457114d2ccee45624043b00aa"} Dec 16 14:09:15 crc kubenswrapper[4757]: I1216 14:09:15.235446 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" event={"ID":"495e66e8-8d8a-43f8-a754-623e9cb354f5","Type":"ContainerStarted","Data":"df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731"} Dec 16 14:09:15 crc kubenswrapper[4757]: I1216 14:09:15.235915 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" event={"ID":"495e66e8-8d8a-43f8-a754-623e9cb354f5","Type":"ContainerStarted","Data":"4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91"} Dec 16 14:09:15 crc kubenswrapper[4757]: I1216 14:09:15.256275 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" podStartSLOduration=2.9785532569999997 podStartE2EDuration="10.256257408s" podCreationTimestamp="2025-12-16 14:09:05 +0000 UTC" firstStartedPulling="2025-12-16 14:09:06.805873586 +0000 UTC m=+4932.233617382" lastFinishedPulling="2025-12-16 14:09:14.083577717 +0000 UTC m=+4939.511321533" observedRunningTime="2025-12-16 14:09:15.250898477 +0000 UTC m=+4940.678642273" watchObservedRunningTime="2025-12-16 14:09:15.256257408 +0000 UTC m=+4940.684001194" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.175483 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-x9djf"] Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.177592 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.303288 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/490b25c3-6dc3-4c12-aeef-d9997a2542ce-host\") pod \"crc-debug-x9djf\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.304089 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8dv\" (UniqueName: \"kubernetes.io/projected/490b25c3-6dc3-4c12-aeef-d9997a2542ce-kube-api-access-bt8dv\") pod \"crc-debug-x9djf\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.406279 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/490b25c3-6dc3-4c12-aeef-d9997a2542ce-host\") pod \"crc-debug-x9djf\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.406350 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8dv\" (UniqueName: \"kubernetes.io/projected/490b25c3-6dc3-4c12-aeef-d9997a2542ce-kube-api-access-bt8dv\") pod \"crc-debug-x9djf\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.406456 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/490b25c3-6dc3-4c12-aeef-d9997a2542ce-host\") pod \"crc-debug-x9djf\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.425805 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8dv\" (UniqueName: \"kubernetes.io/projected/490b25c3-6dc3-4c12-aeef-d9997a2542ce-kube-api-access-bt8dv\") pod \"crc-debug-x9djf\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: I1216 14:09:19.498418 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:09:19 crc kubenswrapper[4757]: W1216 14:09:19.526512 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod490b25c3_6dc3_4c12_aeef_d9997a2542ce.slice/crio-1af696dfc79bc5993bc39f2ce3fbaa484f1eb906de7194f624c254309c2009f0 WatchSource:0}: Error finding container 1af696dfc79bc5993bc39f2ce3fbaa484f1eb906de7194f624c254309c2009f0: Status 404 returned error can't find the container with id 1af696dfc79bc5993bc39f2ce3fbaa484f1eb906de7194f624c254309c2009f0 Dec 16 14:09:20 crc kubenswrapper[4757]: I1216 14:09:20.281353 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" event={"ID":"490b25c3-6dc3-4c12-aeef-d9997a2542ce","Type":"ContainerStarted","Data":"1af696dfc79bc5993bc39f2ce3fbaa484f1eb906de7194f624c254309c2009f0"} Dec 16 14:09:21 crc kubenswrapper[4757]: I1216 14:09:21.181445 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:09:21 crc kubenswrapper[4757]: I1216 14:09:21.181788 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:09:21 crc kubenswrapper[4757]: I1216 14:09:21.181877 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 14:09:21 crc kubenswrapper[4757]: I1216 14:09:21.182961 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 14:09:21 crc kubenswrapper[4757]: I1216 14:09:21.183091 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" gracePeriod=600 Dec 16 14:09:21 crc kubenswrapper[4757]: E1216 14:09:21.306126 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:09:22 crc kubenswrapper[4757]: I1216 14:09:22.316030 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" exitCode=0 Dec 16 14:09:22 crc kubenswrapper[4757]: I1216 14:09:22.316574 4757 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba"} Dec 16 14:09:22 crc kubenswrapper[4757]: I1216 14:09:22.316607 4757 scope.go:117] "RemoveContainer" containerID="c91d48e57b3f8247554192d083c76d7b456728814ff6a8c6a586d7bb61f94fa1" Dec 16 14:09:22 crc kubenswrapper[4757]: I1216 14:09:22.317212 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:09:22 crc kubenswrapper[4757]: E1216 14:09:22.317816 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:09:32 crc kubenswrapper[4757]: I1216 14:09:32.274195 4757 scope.go:117] "RemoveContainer" containerID="29d751e63c0d448053958de7adc87d31b79365f42f3c8ff2587db119a72b7014" Dec 16 14:09:32 crc kubenswrapper[4757]: I1216 14:09:32.329558 4757 scope.go:117] "RemoveContainer" containerID="5a622512f0ebb4ec1bd6414c6fcae192de82dfc2fc19ba44ea97dda903185c37" Dec 16 14:09:32 crc kubenswrapper[4757]: I1216 14:09:32.356282 4757 scope.go:117] "RemoveContainer" containerID="964cece4e8d734c187333e00db6a40ed174b001f6b5e78a8a5b42a8514cd6aeb" Dec 16 14:09:33 crc kubenswrapper[4757]: I1216 14:09:33.433808 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" event={"ID":"490b25c3-6dc3-4c12-aeef-d9997a2542ce","Type":"ContainerStarted","Data":"fe21512cc091fb603b503d260b393bd57d23330e2a8e1c4fd6b98adeed59fb54"} Dec 16 14:09:33 crc kubenswrapper[4757]: I1216 14:09:33.451210 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" podStartSLOduration=1.811416841 podStartE2EDuration="14.45118925s" podCreationTimestamp="2025-12-16 14:09:19 +0000 UTC" firstStartedPulling="2025-12-16 14:09:19.52982776 +0000 UTC m=+4944.957571556" lastFinishedPulling="2025-12-16 14:09:32.169600179 +0000 UTC m=+4957.597343965" observedRunningTime="2025-12-16 14:09:33.444687992 +0000 UTC m=+4958.872431788" watchObservedRunningTime="2025-12-16 14:09:33.45118925 +0000 UTC m=+4958.878933046" Dec 16 14:09:34 crc kubenswrapper[4757]: I1216 14:09:34.957896 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:09:34 crc kubenswrapper[4757]: E1216 14:09:34.958976 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:09:49 crc kubenswrapper[4757]: I1216 14:09:49.950082 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:09:49 crc kubenswrapper[4757]: E1216 14:09:49.952301 4757 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:10:00 crc kubenswrapper[4757]: I1216 14:10:00.950564 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:10:00 crc kubenswrapper[4757]: E1216 14:10:00.951734 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:10:14 crc kubenswrapper[4757]: I1216 14:10:14.958872 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:10:14 crc kubenswrapper[4757]: E1216 14:10:14.959607 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:10:23 crc kubenswrapper[4757]: I1216 14:10:23.948422 4757 generic.go:334] "Generic (PLEG): container finished" podID="490b25c3-6dc3-4c12-aeef-d9997a2542ce" containerID="fe21512cc091fb603b503d260b393bd57d23330e2a8e1c4fd6b98adeed59fb54" exitCode=0 Dec 16 14:10:23 crc kubenswrapper[4757]: I1216 14:10:23.948957 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" event={"ID":"490b25c3-6dc3-4c12-aeef-d9997a2542ce","Type":"ContainerDied","Data":"fe21512cc091fb603b503d260b393bd57d23330e2a8e1c4fd6b98adeed59fb54"} Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.080075 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.112411 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-x9djf"] Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.123656 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-x9djf"] Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.217199 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/490b25c3-6dc3-4c12-aeef-d9997a2542ce-host\") pod \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.217321 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/490b25c3-6dc3-4c12-aeef-d9997a2542ce-host" (OuterVolumeSpecName: "host") pod "490b25c3-6dc3-4c12-aeef-d9997a2542ce" (UID: "490b25c3-6dc3-4c12-aeef-d9997a2542ce"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.217361 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt8dv\" (UniqueName: \"kubernetes.io/projected/490b25c3-6dc3-4c12-aeef-d9997a2542ce-kube-api-access-bt8dv\") pod \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\" (UID: \"490b25c3-6dc3-4c12-aeef-d9997a2542ce\") " Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.217925 4757 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/490b25c3-6dc3-4c12-aeef-d9997a2542ce-host\") on node \"crc\" DevicePath \"\"" Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.222621 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490b25c3-6dc3-4c12-aeef-d9997a2542ce-kube-api-access-bt8dv" (OuterVolumeSpecName: "kube-api-access-bt8dv") pod "490b25c3-6dc3-4c12-aeef-d9997a2542ce" (UID: "490b25c3-6dc3-4c12-aeef-d9997a2542ce"). InnerVolumeSpecName "kube-api-access-bt8dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.319948 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt8dv\" (UniqueName: \"kubernetes.io/projected/490b25c3-6dc3-4c12-aeef-d9997a2542ce-kube-api-access-bt8dv\") on node \"crc\" DevicePath \"\"" Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.970709 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1af696dfc79bc5993bc39f2ce3fbaa484f1eb906de7194f624c254309c2009f0" Dec 16 14:10:25 crc kubenswrapper[4757]: I1216 14:10:25.970770 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-x9djf" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.294512 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-frmc4"] Dec 16 14:10:26 crc kubenswrapper[4757]: E1216 14:10:26.294995 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490b25c3-6dc3-4c12-aeef-d9997a2542ce" containerName="container-00" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.295031 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="490b25c3-6dc3-4c12-aeef-d9997a2542ce" containerName="container-00" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.295263 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="490b25c3-6dc3-4c12-aeef-d9997a2542ce" containerName="container-00" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.295975 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.440940 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-host\") pod \"crc-debug-frmc4\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.441113 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgh2j\" (UniqueName: \"kubernetes.io/projected/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-kube-api-access-cgh2j\") pod \"crc-debug-frmc4\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.542770 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-host\") pod \"crc-debug-frmc4\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.542855 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgh2j\" (UniqueName: \"kubernetes.io/projected/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-kube-api-access-cgh2j\") pod \"crc-debug-frmc4\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.542907 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-host\") pod \"crc-debug-frmc4\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.567240 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgh2j\" (UniqueName: \"kubernetes.io/projected/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-kube-api-access-cgh2j\") pod \"crc-debug-frmc4\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.626276 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.950339 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:10:26 crc kubenswrapper[4757]: E1216 14:10:26.951019 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.971343 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490b25c3-6dc3-4c12-aeef-d9997a2542ce" path="/var/lib/kubelet/pods/490b25c3-6dc3-4c12-aeef-d9997a2542ce/volumes" Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.981785 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" event={"ID":"9a585fb4-96df-4963-bb4f-775e2e5d7ce7","Type":"ContainerStarted","Data":"e8a33188027c4394c4a31ff9f6c573187f4ce38518036f88aa26ba20d4bfcf29"} Dec 16 14:10:26 crc kubenswrapper[4757]: I1216 14:10:26.981833 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" event={"ID":"9a585fb4-96df-4963-bb4f-775e2e5d7ce7","Type":"ContainerStarted","Data":"a7bd069775733e3bfd2d4fdde32524ae02bf80e2a137d9d637ccf3d807cb6690"} Dec 16 14:10:27 crc kubenswrapper[4757]: I1216 14:10:27.002435 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" podStartSLOduration=1.002406597 podStartE2EDuration="1.002406597s" podCreationTimestamp="2025-12-16 14:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:10:26.997285933 +0000 UTC m=+5012.425029779" watchObservedRunningTime="2025-12-16 14:10:27.002406597 +0000 UTC m=+5012.430150423" Dec 16 14:10:28 crc kubenswrapper[4757]: I1216 14:10:28.001294 4757 generic.go:334] "Generic (PLEG): container finished" podID="9a585fb4-96df-4963-bb4f-775e2e5d7ce7" containerID="e8a33188027c4394c4a31ff9f6c573187f4ce38518036f88aa26ba20d4bfcf29" exitCode=0 Dec 16 14:10:28 crc kubenswrapper[4757]: I1216 14:10:28.001340 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" event={"ID":"9a585fb4-96df-4963-bb4f-775e2e5d7ce7","Type":"ContainerDied","Data":"e8a33188027c4394c4a31ff9f6c573187f4ce38518036f88aa26ba20d4bfcf29"} Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.116512 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.296880 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgh2j\" (UniqueName: \"kubernetes.io/projected/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-kube-api-access-cgh2j\") pod \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.297064 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-host\") pod \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\" (UID: \"9a585fb4-96df-4963-bb4f-775e2e5d7ce7\") " Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.297128 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-host" (OuterVolumeSpecName: "host") pod "9a585fb4-96df-4963-bb4f-775e2e5d7ce7" (UID: "9a585fb4-96df-4963-bb4f-775e2e5d7ce7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.297518 4757 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-host\") on node \"crc\" DevicePath \"\"" Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.311798 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-kube-api-access-cgh2j" (OuterVolumeSpecName: "kube-api-access-cgh2j") pod "9a585fb4-96df-4963-bb4f-775e2e5d7ce7" (UID: "9a585fb4-96df-4963-bb4f-775e2e5d7ce7"). InnerVolumeSpecName "kube-api-access-cgh2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.326977 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-frmc4"] Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.335584 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-frmc4"] Dec 16 14:10:29 crc kubenswrapper[4757]: I1216 14:10:29.399734 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgh2j\" (UniqueName: \"kubernetes.io/projected/9a585fb4-96df-4963-bb4f-775e2e5d7ce7-kube-api-access-cgh2j\") on node \"crc\" DevicePath \"\"" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.018285 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7bd069775733e3bfd2d4fdde32524ae02bf80e2a137d9d637ccf3d807cb6690" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.018517 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-frmc4" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.594759 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-ll6cb"] Dec 16 14:10:30 crc kubenswrapper[4757]: E1216 14:10:30.596321 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a585fb4-96df-4963-bb4f-775e2e5d7ce7" containerName="container-00" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.596340 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a585fb4-96df-4963-bb4f-775e2e5d7ce7" containerName="container-00" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.596603 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a585fb4-96df-4963-bb4f-775e2e5d7ce7" containerName="container-00" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.597495 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.722813 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s687\" (UniqueName: \"kubernetes.io/projected/412cabe1-c4b6-42e8-811f-3b060dd7da65-kube-api-access-8s687\") pod \"crc-debug-ll6cb\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.722874 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412cabe1-c4b6-42e8-811f-3b060dd7da65-host\") pod \"crc-debug-ll6cb\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.825019 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s687\" (UniqueName: \"kubernetes.io/projected/412cabe1-c4b6-42e8-811f-3b060dd7da65-kube-api-access-8s687\") pod \"crc-debug-ll6cb\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.825098 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412cabe1-c4b6-42e8-811f-3b060dd7da65-host\") pod \"crc-debug-ll6cb\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.825246 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412cabe1-c4b6-42e8-811f-3b060dd7da65-host\") pod \"crc-debug-ll6cb\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.855748 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s687\" (UniqueName: \"kubernetes.io/projected/412cabe1-c4b6-42e8-811f-3b060dd7da65-kube-api-access-8s687\") pod \"crc-debug-ll6cb\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.916253 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:30 crc kubenswrapper[4757]: W1216 14:10:30.945336 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412cabe1_c4b6_42e8_811f_3b060dd7da65.slice/crio-28f94bfe0eccc4efe81311ad84b0b384820024dd9099c2f5f12778a85afe46a5 WatchSource:0}: Error finding container 28f94bfe0eccc4efe81311ad84b0b384820024dd9099c2f5f12778a85afe46a5: Status 404 returned error can't find the container with id 28f94bfe0eccc4efe81311ad84b0b384820024dd9099c2f5f12778a85afe46a5 Dec 16 14:10:30 crc kubenswrapper[4757]: I1216 14:10:30.983671 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a585fb4-96df-4963-bb4f-775e2e5d7ce7" path="/var/lib/kubelet/pods/9a585fb4-96df-4963-bb4f-775e2e5d7ce7/volumes" Dec 16 14:10:31 crc kubenswrapper[4757]: I1216 14:10:31.037838 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" event={"ID":"412cabe1-c4b6-42e8-811f-3b060dd7da65","Type":"ContainerStarted","Data":"28f94bfe0eccc4efe81311ad84b0b384820024dd9099c2f5f12778a85afe46a5"} Dec 16 14:10:32 crc kubenswrapper[4757]: I1216 14:10:32.047166 4757 generic.go:334] "Generic (PLEG): container finished" podID="412cabe1-c4b6-42e8-811f-3b060dd7da65" containerID="df8bae154520cd4c60e3c099d870c7f0ef6fec5b0cfe7072085e1485d4497acc" exitCode=0 Dec 16 14:10:32 crc kubenswrapper[4757]: I1216 14:10:32.047254 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" event={"ID":"412cabe1-c4b6-42e8-811f-3b060dd7da65","Type":"ContainerDied","Data":"df8bae154520cd4c60e3c099d870c7f0ef6fec5b0cfe7072085e1485d4497acc"} Dec 16 14:10:32 crc kubenswrapper[4757]: I1216 14:10:32.094837 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-ll6cb"] Dec 16 14:10:32 crc kubenswrapper[4757]: I1216 14:10:32.106975 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfbgg/crc-debug-ll6cb"] Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.671335 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.798619 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412cabe1-c4b6-42e8-811f-3b060dd7da65-host\") pod \"412cabe1-c4b6-42e8-811f-3b060dd7da65\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.798764 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s687\" (UniqueName: \"kubernetes.io/projected/412cabe1-c4b6-42e8-811f-3b060dd7da65-kube-api-access-8s687\") pod \"412cabe1-c4b6-42e8-811f-3b060dd7da65\" (UID: \"412cabe1-c4b6-42e8-811f-3b060dd7da65\") " Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.799192 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/412cabe1-c4b6-42e8-811f-3b060dd7da65-host" (OuterVolumeSpecName: "host") pod "412cabe1-c4b6-42e8-811f-3b060dd7da65" (UID: "412cabe1-c4b6-42e8-811f-3b060dd7da65"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.806257 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412cabe1-c4b6-42e8-811f-3b060dd7da65-kube-api-access-8s687" (OuterVolumeSpecName: "kube-api-access-8s687") pod "412cabe1-c4b6-42e8-811f-3b060dd7da65" (UID: "412cabe1-c4b6-42e8-811f-3b060dd7da65"). InnerVolumeSpecName "kube-api-access-8s687". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.901421 4757 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/412cabe1-c4b6-42e8-811f-3b060dd7da65-host\") on node \"crc\" DevicePath \"\"" Dec 16 14:10:33 crc kubenswrapper[4757]: I1216 14:10:33.901464 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s687\" (UniqueName: \"kubernetes.io/projected/412cabe1-c4b6-42e8-811f-3b060dd7da65-kube-api-access-8s687\") on node \"crc\" DevicePath \"\"" Dec 16 14:10:34 crc kubenswrapper[4757]: I1216 14:10:34.066611 4757 scope.go:117] "RemoveContainer" containerID="df8bae154520cd4c60e3c099d870c7f0ef6fec5b0cfe7072085e1485d4497acc" Dec 16 14:10:34 crc kubenswrapper[4757]: I1216 14:10:34.066804 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/crc-debug-ll6cb" Dec 16 14:10:34 crc kubenswrapper[4757]: I1216 14:10:34.976259 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412cabe1-c4b6-42e8-811f-3b060dd7da65" path="/var/lib/kubelet/pods/412cabe1-c4b6-42e8-811f-3b060dd7da65/volumes" Dec 16 14:10:40 crc kubenswrapper[4757]: I1216 14:10:40.950353 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:10:40 crc kubenswrapper[4757]: E1216 14:10:40.951044 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:10:51 crc kubenswrapper[4757]: I1216 14:10:51.948739 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:10:51 crc kubenswrapper[4757]: E1216 14:10:51.949585 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.155980 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cfff6bfd-qz5sk_fd29da8f-05a6-43a9-a943-c6a8a4ef8479/barbican-api/0.log" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.246439 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cfff6bfd-qz5sk_fd29da8f-05a6-43a9-a943-c6a8a4ef8479/barbican-api-log/0.log" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.402293 4757 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d9fbfd57d-79dn2_25963bc5-afd1-4703-a583-df0d8094117d/barbican-keystone-listener/0.log" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.430791 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d9fbfd57d-79dn2_25963bc5-afd1-4703-a583-df0d8094117d/barbican-keystone-listener-log/0.log" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.670972 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-9bd99879-lw2rw_7d1df7bf-6c39-4e49-873a-701b8c05f900/barbican-worker-log/0.log" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.687800 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-9bd99879-lw2rw_7d1df7bf-6c39-4e49-873a-701b8c05f900/barbican-worker/0.log" Dec 16 14:10:56 crc kubenswrapper[4757]: I1216 14:10:56.809328 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg_94b4d3d7-3488-45fa-bbeb-894a4bb55ca1/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.009085 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-central-agent/1.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.028582 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-central-agent/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.079110 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-notification-agent/1.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.286819 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/sg-core/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.435639 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/proxy-httpd/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.451557 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-notification-agent/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.608122 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e2b303e0-e076-4589-9fb3-b51f998a293e/cinder-api/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.781507 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9065cfba-e560-471d-bb64-e20502e5b5d6/cinder-scheduler/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.821017 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e2b303e0-e076-4589-9fb3-b51f998a293e/cinder-api-log/0.log" Dec 16 14:10:57 crc kubenswrapper[4757]: I1216 14:10:57.937032 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9065cfba-e560-471d-bb64-e20502e5b5d6/probe/0.log" Dec 16 14:10:58 crc kubenswrapper[4757]: I1216 14:10:58.127676 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kth25_0b04c40b-fcee-4a0c-b5d0-c994f3fd138e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:10:58 crc kubenswrapper[4757]: I1216 14:10:58.208695 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fc752_ec2b71fe-44a0-4fae-b631-f719f7d735a5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:10:58 crc kubenswrapper[4757]: I1216 14:10:58.697166 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d47554775-kw5w5_dae9e574-826f-4521-8b35-5c836c1cde3b/init/0.log" Dec 16 14:10:58 crc kubenswrapper[4757]: I1216 14:10:58.963357 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d47554775-kw5w5_dae9e574-826f-4521-8b35-5c836c1cde3b/init/0.log" Dec 16 14:10:59 crc kubenswrapper[4757]: I1216 14:10:59.067503 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm_74f0d526-ef23-47fd-b475-6f799fd57ba5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:10:59 crc kubenswrapper[4757]: I1216 14:10:59.219375 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d47554775-kw5w5_dae9e574-826f-4521-8b35-5c836c1cde3b/dnsmasq-dns/0.log" Dec 16 14:10:59 crc kubenswrapper[4757]: I1216 14:10:59.864545 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_40de399c-634b-4d44-a9ca-0aec62a9088b/glance-log/0.log" Dec 16 14:10:59 crc kubenswrapper[4757]: I1216 14:10:59.883695 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_40de399c-634b-4d44-a9ca-0aec62a9088b/glance-httpd/0.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.134208 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e5b2048-f283-4bad-a57a-ae09865c33f2/glance-log/0.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.138747 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e5b2048-f283-4bad-a57a-ae09865c33f2/glance-httpd/0.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.283461 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d66ddf65b-lmltr_65337bd1-c674-4817-91c2-ad150639205c/horizon/2.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.461157 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d66ddf65b-lmltr_65337bd1-c674-4817-91c2-ad150639205c/horizon/1.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.616813 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt_0b943ec1-dc21-47ab-832a-d6f68f3ac17f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.801115 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6fz7k_cd87efc3-653f-4794-89b8-490ea0b504dd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:11:00 crc kubenswrapper[4757]: I1216 14:11:00.830748 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d66ddf65b-lmltr_65337bd1-c674-4817-91c2-ad150639205c/horizon-log/0.log" Dec 16 14:11:01 crc 
kubenswrapper[4757]: I1216 14:11:01.294299 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-66866d5f44-2mhtb_eb176388-d71c-4d06-986d-f62cb0d86fe3/keystone-api/0.log" Dec 16 14:11:01 crc kubenswrapper[4757]: I1216 14:11:01.528606 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29431561-kfrzf_82cd88c0-672b-4d50-ae86-edeae2da08a1/keystone-cron/0.log" Dec 16 14:11:01 crc kubenswrapper[4757]: I1216 14:11:01.753071 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp_d146c06e-d73a-47a2-8e1f-07ca485b1a72/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:11:01 crc kubenswrapper[4757]: I1216 14:11:01.785052 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2457fa41-c003-450d-a55e-f67c36155f94/kube-state-metrics/0.log" Dec 16 14:11:02 crc kubenswrapper[4757]: I1216 14:11:02.366167 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl_ad61ff87-21a4-4583-83b4-65c2253f2993/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:11:02 crc kubenswrapper[4757]: I1216 14:11:02.415605 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f64c6bbf7-pnthz_cac5be05-fb05-4246-86e5-2b8dbdbffd04/neutron-httpd/0.log" Dec 16 14:11:02 crc kubenswrapper[4757]: I1216 14:11:02.643412 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f64c6bbf7-pnthz_cac5be05-fb05-4246-86e5-2b8dbdbffd04/neutron-api/0.log" Dec 16 14:11:02 crc kubenswrapper[4757]: I1216 14:11:02.948839 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:11:02 crc kubenswrapper[4757]: E1216 14:11:02.949124 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:11:03 crc kubenswrapper[4757]: I1216 14:11:03.199784 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_614c552b-9e07-4f84-becd-3dfa75851309/nova-cell0-conductor-conductor/0.log" Dec 16 14:11:03 crc kubenswrapper[4757]: I1216 14:11:03.523175 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f7f0c537-530b-4e4a-ae96-35ba695d26be/nova-cell1-conductor-conductor/0.log" Dec 16 14:11:03 crc kubenswrapper[4757]: I1216 14:11:03.934327 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd/nova-cell1-novncproxy-novncproxy/0.log" Dec 16 14:11:03 crc kubenswrapper[4757]: I1216 14:11:03.999708 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6aaa83ef-e285-41a7-93c0-853ecd275115/nova-api-log/0.log" Dec 16 14:11:04 crc kubenswrapper[4757]: I1216 14:11:04.233335 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8dcjd_13b0d4c7-5eab-400a-9513-9391342fffee/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:11:04 crc kubenswrapper[4757]: I1216 
Dec 16 14:11:04 crc kubenswrapper[4757]: I1216 14:11:04.496756 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6aaa83ef-e285-41a7-93c0-853ecd275115/nova-api-api/0.log"
Dec 16 14:11:04 crc kubenswrapper[4757]: I1216 14:11:04.837406 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_baba14f2-35db-422f-a583-724854b001d1/mysql-bootstrap/0.log"
Dec 16 14:11:05 crc kubenswrapper[4757]: I1216 14:11:05.078019 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fc85d441-d05f-4495-a380-1a5ed58ad631/nova-scheduler-scheduler/0.log"
Dec 16 14:11:05 crc kubenswrapper[4757]: I1216 14:11:05.194775 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_baba14f2-35db-422f-a583-724854b001d1/galera/0.log"
Dec 16 14:11:05 crc kubenswrapper[4757]: I1216 14:11:05.210324 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_baba14f2-35db-422f-a583-724854b001d1/mysql-bootstrap/0.log"
Dec 16 14:11:05 crc kubenswrapper[4757]: I1216 14:11:05.508085 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f3e06047-7a9b-46ce-9021-a88b62993e3d/memcached/0.log"
Dec 16 14:11:05 crc kubenswrapper[4757]: I1216 14:11:05.844677 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2d16196-ea98-44e5-b859-bea9a8392c01/mysql-bootstrap/0.log"
Dec 16 14:11:05 crc kubenswrapper[4757]: I1216 14:11:05.983860 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2d16196-ea98-44e5-b859-bea9a8392c01/mysql-bootstrap/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.045201 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877/nova-metadata-metadata/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.082015 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2d16196-ea98-44e5-b859-bea9a8392c01/galera/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.143535 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_891886a7-6bbd-48b7-8460-a1467bae862a/openstackclient/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.371430 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d654b_e270a555-95f7-466f-8bb6-e76836a33d68/openstack-network-exporter/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.386139 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovsdb-server-init/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.563595 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovs-vswitchd/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.575647 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovsdb-server-init/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.605523 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovsdb-server/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.697958 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xjblp_824c8db6-764f-4062-85c5-3c0fcbe434ce/ovn-controller/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.877432 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_16cd2aac-d1cc-4f30-8b86-8fd811f20f88/openstack-network-exporter/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.900461 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jj4t6_85e4dfc5-8085-4270-847a-a36c8194b383/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:06 crc kubenswrapper[4757]: I1216 14:11:06.940572 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_16cd2aac-d1cc-4f30-8b86-8fd811f20f88/ovn-northd/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.139739 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_89cc68a0-15fd-4a20-bd71-9c8acb5a92c7/ovsdbserver-nb/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.140832 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_89cc68a0-15fd-4a20-bd71-9c8acb5a92c7/openstack-network-exporter/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.183990 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_972a26d6-4f3b-4fc4-8e86-055dfe33652a/openstack-network-exporter/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.305178 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_972a26d6-4f3b-4fc4-8e86-055dfe33652a/ovsdbserver-sb/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.577131 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5494d9c5f6-8dwpv_e8ba167f-6c35-410d-b690-1083c5a482ae/placement-api/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.597853 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_935a64f5-e332-4c06-b4df-f93ec46b7b35/setup-container/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.660598 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5494d9c5f6-8dwpv_e8ba167f-6c35-410d-b690-1083c5a482ae/placement-log/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.826229 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_935a64f5-e332-4c06-b4df-f93ec46b7b35/rabbitmq/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.872710 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_935a64f5-e332-4c06-b4df-f93ec46b7b35/setup-container/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.880458 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_268a1573-c10e-42ca-9776-222ed2186693/setup-container/0.log"
Dec 16 14:11:07 crc kubenswrapper[4757]: I1216 14:11:07.995671 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_268a1573-c10e-42ca-9776-222ed2186693/setup-container/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.074666 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_268a1573-c10e-42ca-9776-222ed2186693/rabbitmq/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.076065 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9_f4c2d838-cc46-4457-9b88-5ea6eb7f14e4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.206185 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-spb5p_95440627-f74c-45d0-a168-e8c37e8e7122/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.334861 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kdgqh_af434566-0202-4f31-a55c-440b7ae410e6/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.377796 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-77555_2c6850c9-0076-4df2-92e7-14521aa14305/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.521620 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xnv2c_228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c/ssh-known-hosts-edpm-deployment/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.682120 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544dfc5bc-8q666_38a8b3dc-7995-4851-96db-0fb6749669b9/proxy-server/0.log"
Dec 16 14:11:08 crc kubenswrapper[4757]: I1216 14:11:08.939731 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544dfc5bc-8q666_38a8b3dc-7995-4851-96db-0fb6749669b9/proxy-httpd/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.002230 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j69vw_ff595563-ea6e-4337-8018-275c60afebfb/swift-ring-rebalance/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.095648 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-auditor/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.169667 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-reaper/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.206865 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-replicator/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.268847 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-server/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.342256 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-auditor/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.375614 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-replicator/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.406291 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-updater/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.435307 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-server/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.532469 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-auditor/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.593290 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-replicator/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.600552 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-expirer/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.626975 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-server/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.678883 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-updater/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.808665 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/rsync/0.log"
Dec 16 14:11:09 crc kubenswrapper[4757]: I1216 14:11:09.809739 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/swift-recon-cron/0.log"
Dec 16 14:11:10 crc kubenswrapper[4757]: I1216 14:11:10.062795 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn_eb89db22-f667-4563-9468-97cd48c1da89/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:10 crc kubenswrapper[4757]: I1216 14:11:10.074192 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d2802d44-5cd2-4f45-80b0-d423d3ab6ea8/tempest-tests-tempest-tests-runner/0.log"
Dec 16 14:11:10 crc kubenswrapper[4757]: I1216 14:11:10.206110 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_20af979c-4f0b-44d5-946c-fa6138ee9539/test-operator-logs-container/0.log"
Dec 16 14:11:10 crc kubenswrapper[4757]: I1216 14:11:10.297528 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nht9n_948d5531-d301-46c5-ac1a-882ceee8df96/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:11:15 crc kubenswrapper[4757]: I1216 14:11:15.948753 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba"
Dec 16 14:11:15 crc kubenswrapper[4757]: E1216 14:11:15.949421 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:11:26 crc kubenswrapper[4757]: I1216 14:11:26.949406 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba"
Dec 16 14:11:26 crc kubenswrapper[4757]: E1216 14:11:26.950338 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.111334 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/util/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.279496 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/pull/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.331363 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/util/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.380804 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/pull/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.451348 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/util/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.525076 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/pull/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.612366 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/extract/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.775858 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-sgvcj_34c17eba-d6e6-4399-a0a0-f25ef7a89fb9/manager/0.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.848360 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-4z9jz_3717fd56-4339-4ad6-940d-b5023c76d32f/manager/1.log"
Dec 16 14:11:37 crc kubenswrapper[4757]: I1216 14:11:37.909414 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-4z9jz_3717fd56-4339-4ad6-940d-b5023c76d32f/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.051129 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-t9vm8_a6449c1f-3695-445d-90b0-64b4c79cde05/manager/1.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.085052 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-t9vm8_a6449c1f-3695-445d-90b0-64b4c79cde05/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.246518 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-lvxk6_b46e5138-a221-489d-9d7a-a54cf3938d64/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.318359 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-qtkjq_fbd5f746-9483-455c-988e-2e882623d09e/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.472984 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-w9gps_6c815add-abbd-4655-b257-d50ab074414a/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.691672 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-45bgz_6333c537-0505-48c0-b197-a609084a2a2c/manager/1.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.744787 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-fw274_904525e7-6f82-4fbf-928a-99194a97829a/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.900190 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-45bgz_6333c537-0505-48c0-b197-a609084a2a2c/manager/0.log"
Dec 16 14:11:38 crc kubenswrapper[4757]: I1216 14:11:38.972285 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-wgrph_72a6aea3-2309-4c98-802b-416feed1ba0f/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.162504 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-psxvw_75d829d5-a3cd-48c6-8aff-07f7d325b4f9/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.173453 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-wg8wq_eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.426684 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-v7jvg_545806dc-d916-4704-bc27-f5a46915fb56/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.508316 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-bqfgm_83154b06-c2df-4a44-9a33-4971cd60add3/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.632466 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-dxgr9_e9f15431-d8cd-408d-8169-e06457cabccc/manager/1.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.746383 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-dxgr9_e9f15431-d8cd-408d-8169-e06457cabccc/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.796807 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9_67c1a6c7-35d1-48a9-a058-13e5d5599fe7/manager/1.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.866672 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9_67c1a6c7-35d1-48a9-a058-13e5d5599fe7/manager/0.log"
Dec 16 14:11:39 crc kubenswrapper[4757]: I1216 14:11:39.948959 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba"
Dec 16 14:11:39 crc kubenswrapper[4757]: E1216 14:11:39.949289 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:11:40 crc kubenswrapper[4757]: I1216 14:11:40.356105 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56fbb56c9b-wtj5t_4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0/operator/0.log"
Dec 16 14:11:40 crc kubenswrapper[4757]: I1216 14:11:40.396033 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cn9nk_b56f5192-de72-41a1-b733-edd456541eda/registry-server/0.log"
Dec 16 14:11:40 crc kubenswrapper[4757]: I1216 14:11:40.758935 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-x72bf_60821702-232d-4eb4-b70f-15e87e070aed/manager/0.log"
Dec 16 14:11:40 crc kubenswrapper[4757]: I1216 14:11:40.837682 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-bk25g_28ec7b61-2e0c-4ad7-8569-eeb5973b976d/manager/0.log"
Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.105707 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gcxgg_7432087f-983f-4b3d-af98-40238ceba951/operator/0.log"
Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.178607 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-554cfb9dfb-d6w2k_7b6693c4-d7ad-4edc-ba55-baa2fea5094a/manager/0.log"
Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.307968 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-wgxxx_120aab20-c2fb-441d-9c07-bd05c0678a11/manager/0.log"
Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.464454 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-85zkh_ac2d53dd-c297-44b1-bcb1-a3025530eb5c/manager/0.log"
Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.515196 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-w6kx8_42d952f0-a650-484d-9e6b-b1c6c0f252dc/manager/1.log"
Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.672648 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-w6kx8_42d952f0-a650-484d-9e6b-b1c6c0f252dc/manager/0.log"
path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-w6kx8_42d952f0-a650-484d-9e6b-b1c6c0f252dc/manager/0.log" Dec 16 14:11:41 crc kubenswrapper[4757]: I1216 14:11:41.708555 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-dr2qv_ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89/manager/0.log" Dec 16 14:11:54 crc kubenswrapper[4757]: I1216 14:11:54.960192 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:11:54 crc kubenswrapper[4757]: E1216 14:11:54.961313 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:12:03 crc kubenswrapper[4757]: I1216 14:12:03.252515 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qr4dp_cf6bf9c1-5d43-4ab3-a38f-d96308345ff4/control-plane-machine-set-operator/0.log" Dec 16 14:12:03 crc kubenswrapper[4757]: I1216 14:12:03.387906 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wg8vk_ff965e39-8bf4-40d8-b7af-702f0c47bbb4/kube-rbac-proxy/0.log" Dec 16 14:12:03 crc kubenswrapper[4757]: I1216 14:12:03.456258 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wg8vk_ff965e39-8bf4-40d8-b7af-702f0c47bbb4/machine-api-operator/0.log" Dec 16 14:12:08 crc kubenswrapper[4757]: I1216 14:12:08.949771 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:12:08 crc kubenswrapper[4757]: E1216 14:12:08.951434 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:12:16 crc kubenswrapper[4757]: I1216 14:12:16.490529 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-82cdt_1a132cd7-a7ae-476a-ad05-9a2ec1981349/cert-manager-cainjector/0.log" Dec 16 14:12:16 crc kubenswrapper[4757]: I1216 14:12:16.507949 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-dd8lt_fc655451-6b29-42c9-836d-ae8ae9a5d77b/cert-manager-controller/0.log" Dec 16 14:12:16 crc kubenswrapper[4757]: I1216 14:12:16.648912 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-cgxqx_04fe3c89-14a7-4830-b290-538d3ae20a12/cert-manager-webhook/0.log" Dec 16 14:12:23 crc kubenswrapper[4757]: I1216 14:12:23.948910 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:12:23 crc kubenswrapper[4757]: E1216 14:12:23.949670 4757 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:12:32 crc kubenswrapper[4757]: I1216 14:12:32.354527 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-b9shc_536f7375-828d-41f1-afd3-509271873ae2/nmstate-console-plugin/0.log" Dec 16 14:12:32 crc kubenswrapper[4757]: I1216 14:12:32.629937 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8dspc_7a0c9c35-f5d3-4503-831d-840fdc460911/nmstate-handler/0.log" Dec 16 14:12:32 crc kubenswrapper[4757]: I1216 14:12:32.735982 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-gj6pf_e6a65757-6cab-4b58-a399-d186414d6485/kube-rbac-proxy/0.log" Dec 16 14:12:32 crc kubenswrapper[4757]: I1216 14:12:32.796339 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-gj6pf_e6a65757-6cab-4b58-a399-d186414d6485/nmstate-metrics/0.log" Dec 16 14:12:32 crc kubenswrapper[4757]: I1216 14:12:32.941379 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-crfw5_13ed71f2-85e0-4dc6-94c1-82e12982c67f/nmstate-operator/0.log" Dec 16 14:12:33 crc kubenswrapper[4757]: I1216 14:12:33.030898 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-l6pg2_97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922/nmstate-webhook/0.log" Dec 16 14:12:34 crc kubenswrapper[4757]: I1216 14:12:34.954849 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:12:34 crc kubenswrapper[4757]: E1216 14:12:34.955790 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:12:47 crc kubenswrapper[4757]: I1216 14:12:47.949181 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:12:47 crc kubenswrapper[4757]: E1216 14:12:47.951186 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:12:53 crc kubenswrapper[4757]: I1216 14:12:53.667409 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-jwvt9_dd784875-6828-4554-8791-24182d80b82f/kube-rbac-proxy/0.log" Dec 16 14:12:53 crc kubenswrapper[4757]: I1216 14:12:53.730755 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5bddd4b946-jwvt9_dd784875-6828-4554-8791-24182d80b82f/controller/0.log" Dec 16 14:12:53 crc kubenswrapper[4757]: I1216 14:12:53.929605 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.402963 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.405507 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.408935 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.497542 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.660479 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.661717 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.686642 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.774766 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:12:54 crc kubenswrapper[4757]: I1216 14:12:54.972082 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.042921 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.043399 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/controller/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.051351 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.302139 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/frr-metrics/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.311776 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/kube-rbac-proxy/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.431636 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/kube-rbac-proxy-frr/0.log" Dec 16 14:12:55 crc 
kubenswrapper[4757]: I1216 14:12:55.568174 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/reloader/0.log" Dec 16 14:12:55 crc kubenswrapper[4757]: I1216 14:12:55.848936 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-cwgds_97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b/frr-k8s-webhook-server/0.log" Dec 16 14:12:56 crc kubenswrapper[4757]: I1216 14:12:56.140455 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84c894d85c-x59lp_019f84e1-6fee-4829-a087-c756c955060a/manager/1.log" Dec 16 14:12:56 crc kubenswrapper[4757]: I1216 14:12:56.233547 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84c894d85c-x59lp_019f84e1-6fee-4829-a087-c756c955060a/manager/0.log" Dec 16 14:12:56 crc kubenswrapper[4757]: I1216 14:12:56.526384 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b74cd5c78-9s4s8_93d50c8f-84bf-4f97-96ab-98cbbd370476/webhook-server/0.log" Dec 16 14:12:56 crc kubenswrapper[4757]: I1216 14:12:56.668825 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-996kc_8074db35-8766-4cc2-bc06-be8a150f92e9/kube-rbac-proxy/0.log" Dec 16 14:12:56 crc kubenswrapper[4757]: I1216 14:12:56.698439 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/frr/0.log" Dec 16 14:12:57 crc kubenswrapper[4757]: I1216 14:12:57.046904 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-996kc_8074db35-8766-4cc2-bc06-be8a150f92e9/speaker/0.log" Dec 16 14:12:59 crc kubenswrapper[4757]: I1216 14:12:59.949285 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:12:59 crc kubenswrapper[4757]: E1216 14:12:59.949886 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:13:11 crc kubenswrapper[4757]: I1216 14:13:11.429185 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/util/0.log" Dec 16 14:13:11 crc kubenswrapper[4757]: I1216 14:13:11.897888 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/pull/0.log" Dec 16 14:13:11 crc kubenswrapper[4757]: I1216 14:13:11.949415 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:13:11 crc kubenswrapper[4757]: E1216 14:13:11.949631 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.001467 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/pull/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.040377 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/util/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.342649 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/util/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.359673 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/pull/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.445355 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/extract/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.662062 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/util/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.798513 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/util/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.842952 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/pull/0.log" Dec 16 14:13:12 crc kubenswrapper[4757]: I1216 14:13:12.844854 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/pull/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.071372 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/pull/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.078309 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/extract/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.091012 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/util/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.249871 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-utilities/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.455981 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-content/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.481971 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-utilities/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.496452 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-content/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.647081 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-utilities/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.721961 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-content/0.log" Dec 16 14:13:13 crc kubenswrapper[4757]: I1216 14:13:13.991550 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-utilities/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.198340 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-content/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.238880 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-utilities/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.329981 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-content/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.340722 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/registry-server/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.606990 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-content/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.608453 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-utilities/0.log" Dec 16 14:13:14 crc kubenswrapper[4757]: I1216 14:13:14.985786 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-utilities/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.006074 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z52vp_b40bf055-8b99-4c86-9e45-ed2253aa09a1/marketplace-operator/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.294253 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-content/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.342061 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-utilities/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.377762 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-content/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.432044 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/registry-server/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.906666 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-content/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.946175 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-utilities/0.log" Dec 16 14:13:15 crc kubenswrapper[4757]: I1216 14:13:15.963052 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-utilities/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.171323 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/registry-server/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.242974 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-content/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.274459 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-utilities/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.281586 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-content/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.482844 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-utilities/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.528116 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-content/0.log" Dec 16 14:13:16 crc kubenswrapper[4757]: I1216 14:13:16.743547 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/registry-server/0.log" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.299730 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vgwdh"] Dec 16 14:13:25 crc kubenswrapper[4757]: E1216 14:13:25.300545 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412cabe1-c4b6-42e8-811f-3b060dd7da65" containerName="container-00" Dec 16 14:13:25 
crc kubenswrapper[4757]: I1216 14:13:25.300557 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="412cabe1-c4b6-42e8-811f-3b060dd7da65" containerName="container-00" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.300752 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="412cabe1-c4b6-42e8-811f-3b060dd7da65" containerName="container-00" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.302058 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.320520 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgwdh"] Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.461733 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5c68\" (UniqueName: \"kubernetes.io/projected/7a9a5919-e16e-4eec-8264-9d171635759e-kube-api-access-m5c68\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.462259 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-utilities\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.462327 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-catalog-content\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.564063 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-catalog-content\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.564282 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5c68\" (UniqueName: \"kubernetes.io/projected/7a9a5919-e16e-4eec-8264-9d171635759e-kube-api-access-m5c68\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.564377 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-utilities\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.565134 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-utilities\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" 
Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.565149 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-catalog-content\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.586440 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5c68\" (UniqueName: \"kubernetes.io/projected/7a9a5919-e16e-4eec-8264-9d171635759e-kube-api-access-m5c68\") pod \"certified-operators-vgwdh\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:25 crc kubenswrapper[4757]: I1216 14:13:25.624329 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:26 crc kubenswrapper[4757]: I1216 14:13:26.252134 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgwdh"] Dec 16 14:13:26 crc kubenswrapper[4757]: I1216 14:13:26.612232 4757 generic.go:334] "Generic (PLEG): container finished" podID="7a9a5919-e16e-4eec-8264-9d171635759e" containerID="d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b" exitCode=0 Dec 16 14:13:26 crc kubenswrapper[4757]: I1216 14:13:26.612556 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerDied","Data":"d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b"} Dec 16 14:13:26 crc kubenswrapper[4757]: I1216 14:13:26.612587 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerStarted","Data":"3d0f8a3d961cbe643f20d6683e8e86982761efef15e8f62e5d9264ac83f9f71d"} Dec 16 14:13:26 crc kubenswrapper[4757]: I1216 14:13:26.950063 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:13:26 crc kubenswrapper[4757]: E1216 14:13:26.950529 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:13:28 crc kubenswrapper[4757]: I1216 14:13:28.629667 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerStarted","Data":"45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e"} Dec 16 14:13:29 crc kubenswrapper[4757]: I1216 14:13:29.638023 4757 generic.go:334] "Generic (PLEG): container finished" podID="7a9a5919-e16e-4eec-8264-9d171635759e" containerID="45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e" exitCode=0 Dec 16 14:13:29 crc kubenswrapper[4757]: I1216 14:13:29.638276 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" 
event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerDied","Data":"45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e"} Dec 16 14:13:30 crc kubenswrapper[4757]: I1216 14:13:30.649821 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerStarted","Data":"5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5"} Dec 16 14:13:30 crc kubenswrapper[4757]: I1216 14:13:30.677465 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vgwdh" podStartSLOduration=2.27474495 podStartE2EDuration="5.6774436s" podCreationTimestamp="2025-12-16 14:13:25 +0000 UTC" firstStartedPulling="2025-12-16 14:13:26.614877504 +0000 UTC m=+5192.042621310" lastFinishedPulling="2025-12-16 14:13:30.017576164 +0000 UTC m=+5195.445319960" observedRunningTime="2025-12-16 14:13:30.666533191 +0000 UTC m=+5196.094277007" watchObservedRunningTime="2025-12-16 14:13:30.6774436 +0000 UTC m=+5196.105187416" Dec 16 14:13:35 crc kubenswrapper[4757]: I1216 14:13:35.625760 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:35 crc kubenswrapper[4757]: I1216 14:13:35.626215 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:35 crc kubenswrapper[4757]: I1216 14:13:35.672669 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:35 crc kubenswrapper[4757]: I1216 14:13:35.789152 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:37 crc kubenswrapper[4757]: I1216 14:13:37.886500 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgwdh"] Dec 16 14:13:37 crc kubenswrapper[4757]: I1216 14:13:37.887117 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vgwdh" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="registry-server" containerID="cri-o://5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5" gracePeriod=2 Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.521080 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.678831 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-utilities\") pod \"7a9a5919-e16e-4eec-8264-9d171635759e\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.679088 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-catalog-content\") pod \"7a9a5919-e16e-4eec-8264-9d171635759e\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.679196 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5c68\" (UniqueName: \"kubernetes.io/projected/7a9a5919-e16e-4eec-8264-9d171635759e-kube-api-access-m5c68\") pod \"7a9a5919-e16e-4eec-8264-9d171635759e\" (UID: \"7a9a5919-e16e-4eec-8264-9d171635759e\") " Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.679400 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-utilities" (OuterVolumeSpecName: "utilities") pod "7a9a5919-e16e-4eec-8264-9d171635759e" (UID: "7a9a5919-e16e-4eec-8264-9d171635759e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.679727 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.686123 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9a5919-e16e-4eec-8264-9d171635759e-kube-api-access-m5c68" (OuterVolumeSpecName: "kube-api-access-m5c68") pod "7a9a5919-e16e-4eec-8264-9d171635759e" (UID: "7a9a5919-e16e-4eec-8264-9d171635759e"). InnerVolumeSpecName "kube-api-access-m5c68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.725314 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a9a5919-e16e-4eec-8264-9d171635759e" (UID: "7a9a5919-e16e-4eec-8264-9d171635759e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.767138 4757 generic.go:334] "Generic (PLEG): container finished" podID="7a9a5919-e16e-4eec-8264-9d171635759e" containerID="5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5" exitCode=0 Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.767178 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerDied","Data":"5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5"} Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.767202 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgwdh" event={"ID":"7a9a5919-e16e-4eec-8264-9d171635759e","Type":"ContainerDied","Data":"3d0f8a3d961cbe643f20d6683e8e86982761efef15e8f62e5d9264ac83f9f71d"} Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.767219 4757 scope.go:117] "RemoveContainer" containerID="5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.767345 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgwdh" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.781398 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9a5919-e16e-4eec-8264-9d171635759e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.781425 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5c68\" (UniqueName: \"kubernetes.io/projected/7a9a5919-e16e-4eec-8264-9d171635759e-kube-api-access-m5c68\") on node \"crc\" DevicePath \"\"" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.788555 4757 scope.go:117] "RemoveContainer" containerID="45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.823212 4757 scope.go:117] "RemoveContainer" containerID="d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.823358 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgwdh"] Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.850304 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vgwdh"] Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.864434 4757 scope.go:117] "RemoveContainer" containerID="5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5" Dec 16 14:13:38 crc kubenswrapper[4757]: E1216 14:13:38.864893 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5\": container with ID starting with 5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5 not found: ID does not exist" containerID="5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.864940 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5"} err="failed to get container status 
\"5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5\": rpc error: code = NotFound desc = could not find container \"5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5\": container with ID starting with 5ec8cd9d65258c064c8fb32574e28465c7fae5c290389116604731a9363fdaf5 not found: ID does not exist" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.864965 4757 scope.go:117] "RemoveContainer" containerID="45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e" Dec 16 14:13:38 crc kubenswrapper[4757]: E1216 14:13:38.865385 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e\": container with ID starting with 45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e not found: ID does not exist" containerID="45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.865405 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e"} err="failed to get container status \"45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e\": rpc error: code = NotFound desc = could not find container \"45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e\": container with ID starting with 45644d6d3ae1f1c5e04d91f743ec92766a3bc4c510dfa9722dc994df9e24063e not found: ID does not exist" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.865419 4757 scope.go:117] "RemoveContainer" containerID="d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b" Dec 16 14:13:38 crc kubenswrapper[4757]: E1216 14:13:38.865750 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b\": container with ID starting with d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b not found: ID does not exist" containerID="d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.865769 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b"} err="failed to get container status \"d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b\": rpc error: code = NotFound desc = could not find container \"d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b\": container with ID starting with d587a5eac1622dd42ca2f2f7dfb8d045b1dbcb0e3da7ebeb67deb45a2cf26c9b not found: ID does not exist" Dec 16 14:13:38 crc kubenswrapper[4757]: I1216 14:13:38.958435 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" path="/var/lib/kubelet/pods/7a9a5919-e16e-4eec-8264-9d171635759e/volumes" Dec 16 14:13:40 crc kubenswrapper[4757]: I1216 14:13:40.948852 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:13:40 crc kubenswrapper[4757]: E1216 14:13:40.949687 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:13:55 crc kubenswrapper[4757]: I1216 14:13:55.949355 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:13:55 crc kubenswrapper[4757]: E1216 14:13:55.950114 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:14:09 crc kubenswrapper[4757]: I1216 14:14:09.948688 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:14:09 crc kubenswrapper[4757]: E1216 14:14:09.949861 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:14:21 crc kubenswrapper[4757]: I1216 14:14:21.948906 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:14:22 crc kubenswrapper[4757]: I1216 14:14:22.734259 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098"} Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.179740 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk"] Dec 16 14:15:00 crc kubenswrapper[4757]: E1216 14:15:00.180882 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="extract-content" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.180899 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="extract-content" Dec 16 14:15:00 crc kubenswrapper[4757]: E1216 14:15:00.180920 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="extract-utilities" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.180929 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="extract-utilities" Dec 16 14:15:00 crc kubenswrapper[4757]: E1216 14:15:00.180957 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="registry-server" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.180964 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="registry-server" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.181227 4757 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9a5919-e16e-4eec-8264-9d171635759e" containerName="registry-server" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.182117 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.186460 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.186707 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.192868 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk"] Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.269504 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-secret-volume\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.269579 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-config-volume\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.269624 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47cj\" (UniqueName: \"kubernetes.io/projected/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-kube-api-access-l47cj\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.371094 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-secret-volume\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.371182 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-config-volume\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.371231 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l47cj\" (UniqueName: \"kubernetes.io/projected/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-kube-api-access-l47cj\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 
14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.372111 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-config-volume\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.384087 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-secret-volume\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.394600 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47cj\" (UniqueName: \"kubernetes.io/projected/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-kube-api-access-l47cj\") pod \"collect-profiles-29431575-5crjk\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:00 crc kubenswrapper[4757]: I1216 14:15:00.516770 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:01 crc kubenswrapper[4757]: I1216 14:15:01.018394 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk"] Dec 16 14:15:01 crc kubenswrapper[4757]: W1216 14:15:01.021105 4757 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31bf3dc_bcac_42b8_9c98_37d4f669b6a5.slice/crio-549144ea73ed88d154ea50177eaa8dc956b2057ce2bbbbcfc1a15ef68a99ca8c WatchSource:0}: Error finding container 549144ea73ed88d154ea50177eaa8dc956b2057ce2bbbbcfc1a15ef68a99ca8c: Status 404 returned error can't find the container with id 549144ea73ed88d154ea50177eaa8dc956b2057ce2bbbbcfc1a15ef68a99ca8c Dec 16 14:15:01 crc kubenswrapper[4757]: I1216 14:15:01.113784 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" event={"ID":"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5","Type":"ContainerStarted","Data":"549144ea73ed88d154ea50177eaa8dc956b2057ce2bbbbcfc1a15ef68a99ca8c"} Dec 16 14:15:02 crc kubenswrapper[4757]: I1216 14:15:02.127120 4757 generic.go:334] "Generic (PLEG): container finished" podID="c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" containerID="a96d6f55dadb2b520cf1e3149a5a1303ba520af48006d1b991c5e04ab8dc934f" exitCode=0 Dec 16 14:15:02 crc kubenswrapper[4757]: I1216 14:15:02.127457 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" event={"ID":"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5","Type":"ContainerDied","Data":"a96d6f55dadb2b520cf1e3149a5a1303ba520af48006d1b991c5e04ab8dc934f"} Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.511235 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.634492 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l47cj\" (UniqueName: \"kubernetes.io/projected/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-kube-api-access-l47cj\") pod \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.634632 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-secret-volume\") pod \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.634745 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-config-volume\") pod \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\" (UID: \"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5\") " Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.652659 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" (UID: "c31bf3dc-bcac-42b8-9c98-37d4f669b6a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.653576 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-kube-api-access-l47cj" (OuterVolumeSpecName: "kube-api-access-l47cj") pod "c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" (UID: "c31bf3dc-bcac-42b8-9c98-37d4f669b6a5"). InnerVolumeSpecName "kube-api-access-l47cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.659748 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" (UID: "c31bf3dc-bcac-42b8-9c98-37d4f669b6a5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.746150 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l47cj\" (UniqueName: \"kubernetes.io/projected/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-kube-api-access-l47cj\") on node \"crc\" DevicePath \"\"" Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.746189 4757 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:15:03 crc kubenswrapper[4757]: I1216 14:15:03.746200 4757 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c31bf3dc-bcac-42b8-9c98-37d4f669b6a5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:15:04 crc kubenswrapper[4757]: I1216 14:15:04.146795 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" event={"ID":"c31bf3dc-bcac-42b8-9c98-37d4f669b6a5","Type":"ContainerDied","Data":"549144ea73ed88d154ea50177eaa8dc956b2057ce2bbbbcfc1a15ef68a99ca8c"} Dec 16 14:15:04 crc kubenswrapper[4757]: I1216 14:15:04.146833 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549144ea73ed88d154ea50177eaa8dc956b2057ce2bbbbcfc1a15ef68a99ca8c" Dec 16 14:15:04 crc kubenswrapper[4757]: I1216 14:15:04.146898 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431575-5crjk" Dec 16 14:15:04 crc kubenswrapper[4757]: I1216 14:15:04.598961 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2"] Dec 16 14:15:04 crc kubenswrapper[4757]: I1216 14:15:04.607388 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431530-wc7w2"] Dec 16 14:15:04 crc kubenswrapper[4757]: I1216 14:15:04.962073 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7278c05d-34ac-48b0-9c9f-14b6ac22d900" path="/var/lib/kubelet/pods/7278c05d-34ac-48b0-9c9f-14b6ac22d900/volumes" Dec 16 14:15:32 crc kubenswrapper[4757]: I1216 14:15:32.604922 4757 scope.go:117] "RemoveContainer" containerID="fe21512cc091fb603b503d260b393bd57d23330e2a8e1c4fd6b98adeed59fb54" Dec 16 14:15:32 crc kubenswrapper[4757]: I1216 14:15:32.682502 4757 scope.go:117] "RemoveContainer" containerID="b879f1b69035121ff9466ac2c66f8183d417e50badaba0378f22f47678c5a7c3" Dec 16 14:15:34 crc kubenswrapper[4757]: I1216 14:15:34.442983 4757 generic.go:334] "Generic (PLEG): container finished" podID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerID="4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91" exitCode=0 Dec 16 14:15:34 crc kubenswrapper[4757]: I1216 14:15:34.443100 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" event={"ID":"495e66e8-8d8a-43f8-a754-623e9cb354f5","Type":"ContainerDied","Data":"4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91"} Dec 16 14:15:34 crc kubenswrapper[4757]: I1216 14:15:34.443970 4757 scope.go:117] "RemoveContainer" containerID="4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91" Dec 16 14:15:35 crc kubenswrapper[4757]: I1216 14:15:35.384296 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-gfbgg_must-gather-pw4zt_495e66e8-8d8a-43f8-a754-623e9cb354f5/gather/0.log" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.032702 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gfbgg/must-gather-pw4zt"] Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.033535 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="copy" containerID="cri-o://df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731" gracePeriod=2 Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.043460 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gfbgg/must-gather-pw4zt"] Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.449360 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gfbgg_must-gather-pw4zt_495e66e8-8d8a-43f8-a754-623e9cb354f5/copy/0.log" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.450210 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.549825 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gfbgg_must-gather-pw4zt_495e66e8-8d8a-43f8-a754-623e9cb354f5/copy/0.log" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.550276 4757 generic.go:334] "Generic (PLEG): container finished" podID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerID="df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731" exitCode=143 Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.550327 4757 scope.go:117] "RemoveContainer" containerID="df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.550337 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gfbgg/must-gather-pw4zt" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.570741 4757 scope.go:117] "RemoveContainer" containerID="4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.577623 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/495e66e8-8d8a-43f8-a754-623e9cb354f5-must-gather-output\") pod \"495e66e8-8d8a-43f8-a754-623e9cb354f5\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.577804 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpgdq\" (UniqueName: \"kubernetes.io/projected/495e66e8-8d8a-43f8-a754-623e9cb354f5-kube-api-access-tpgdq\") pod \"495e66e8-8d8a-43f8-a754-623e9cb354f5\" (UID: \"495e66e8-8d8a-43f8-a754-623e9cb354f5\") " Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.586252 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495e66e8-8d8a-43f8-a754-623e9cb354f5-kube-api-access-tpgdq" (OuterVolumeSpecName: "kube-api-access-tpgdq") pod "495e66e8-8d8a-43f8-a754-623e9cb354f5" (UID: "495e66e8-8d8a-43f8-a754-623e9cb354f5"). InnerVolumeSpecName "kube-api-access-tpgdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.613684 4757 scope.go:117] "RemoveContainer" containerID="df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731" Dec 16 14:15:44 crc kubenswrapper[4757]: E1216 14:15:44.615324 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731\": container with ID starting with df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731 not found: ID does not exist" containerID="df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.615427 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731"} err="failed to get container status \"df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731\": rpc error: code = NotFound desc = could not find container \"df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731\": container with ID starting with df95e352f2100f6abc6bc695b5510c29f67f3af70fc66fc59666770eb1373731 not found: ID does not exist" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.615510 4757 scope.go:117] "RemoveContainer" containerID="4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91" Dec 16 14:15:44 crc kubenswrapper[4757]: E1216 14:15:44.617256 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91\": container with ID starting with 4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91 not found: ID does not exist" containerID="4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.617361 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91"} err="failed to get container status \"4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91\": rpc error: code = NotFound desc = could not find container \"4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91\": container with ID starting with 4c72c9955a44cd877d092d62ce66e3bbaef2a40e668c79a3bcdd7d58e9e36f91 not found: ID does not exist" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.682233 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpgdq\" (UniqueName: \"kubernetes.io/projected/495e66e8-8d8a-43f8-a754-623e9cb354f5-kube-api-access-tpgdq\") on node \"crc\" DevicePath \"\"" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.795893 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495e66e8-8d8a-43f8-a754-623e9cb354f5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "495e66e8-8d8a-43f8-a754-623e9cb354f5" (UID: "495e66e8-8d8a-43f8-a754-623e9cb354f5"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.886688 4757 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/495e66e8-8d8a-43f8-a754-623e9cb354f5-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 16 14:15:44 crc kubenswrapper[4757]: I1216 14:15:44.959806 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" path="/var/lib/kubelet/pods/495e66e8-8d8a-43f8-a754-623e9cb354f5/volumes" Dec 16 14:16:32 crc kubenswrapper[4757]: I1216 14:16:32.828706 4757 scope.go:117] "RemoveContainer" containerID="e8a33188027c4394c4a31ff9f6c573187f4ce38518036f88aa26ba20d4bfcf29" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.458297 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bx94j"] Dec 16 14:16:35 crc kubenswrapper[4757]: E1216 14:16:35.459611 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="gather" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.459632 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="gather" Dec 16 14:16:35 crc kubenswrapper[4757]: E1216 14:16:35.459668 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" containerName="collect-profiles" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.459681 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" containerName="collect-profiles" Dec 16 14:16:35 crc kubenswrapper[4757]: E1216 14:16:35.459712 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="copy" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.459727 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="copy" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.460086 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="gather" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.460126 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31bf3dc-bcac-42b8-9c98-37d4f669b6a5" containerName="collect-profiles" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.460177 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="495e66e8-8d8a-43f8-a754-623e9cb354f5" containerName="copy" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.462480 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.469182 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bx94j"] Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.554946 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-utilities\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.555149 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx7ps\" (UniqueName: \"kubernetes.io/projected/42856886-8337-4999-b58a-f147848e148d-kube-api-access-zx7ps\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.555250 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-catalog-content\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.657758 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-utilities\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.658358 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-utilities\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.658867 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx7ps\" (UniqueName: \"kubernetes.io/projected/42856886-8337-4999-b58a-f147848e148d-kube-api-access-zx7ps\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.658960 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-catalog-content\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.659239 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-catalog-content\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.687936 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zx7ps\" (UniqueName: \"kubernetes.io/projected/42856886-8337-4999-b58a-f147848e148d-kube-api-access-zx7ps\") pod \"redhat-operators-bx94j\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:35 crc kubenswrapper[4757]: I1216 14:16:35.841216 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:36 crc kubenswrapper[4757]: I1216 14:16:36.340755 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bx94j"] Dec 16 14:16:37 crc kubenswrapper[4757]: I1216 14:16:37.093658 4757 generic.go:334] "Generic (PLEG): container finished" podID="42856886-8337-4999-b58a-f147848e148d" containerID="8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1" exitCode=0 Dec 16 14:16:37 crc kubenswrapper[4757]: I1216 14:16:37.093713 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerDied","Data":"8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1"} Dec 16 14:16:37 crc kubenswrapper[4757]: I1216 14:16:37.094965 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerStarted","Data":"9560c2d66c1e6f03cc1cdee01b9109640c65208ad4cab8c43fe21b0b3471edee"} Dec 16 14:16:37 crc kubenswrapper[4757]: I1216 14:16:37.095903 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 14:16:39 crc kubenswrapper[4757]: I1216 14:16:39.116527 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerStarted","Data":"ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9"} Dec 16 14:16:42 crc kubenswrapper[4757]: I1216 14:16:42.187521 4757 generic.go:334] "Generic (PLEG): container finished" podID="42856886-8337-4999-b58a-f147848e148d" containerID="ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9" exitCode=0 Dec 16 14:16:42 crc kubenswrapper[4757]: I1216 14:16:42.187630 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerDied","Data":"ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9"} Dec 16 14:16:43 crc kubenswrapper[4757]: I1216 14:16:43.202870 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerStarted","Data":"9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c"} Dec 16 14:16:43 crc kubenswrapper[4757]: I1216 14:16:43.232691 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bx94j" podStartSLOduration=2.406731839 podStartE2EDuration="8.2326531s" podCreationTimestamp="2025-12-16 14:16:35 +0000 UTC" firstStartedPulling="2025-12-16 14:16:37.095587285 +0000 UTC m=+5382.523331091" lastFinishedPulling="2025-12-16 14:16:42.921508556 +0000 UTC m=+5388.349252352" observedRunningTime="2025-12-16 14:16:43.225557385 +0000 UTC m=+5388.653301211" watchObservedRunningTime="2025-12-16 14:16:43.2326531 +0000 UTC m=+5388.660396916" Dec 16 14:16:45 crc 
kubenswrapper[4757]: I1216 14:16:45.841981 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:45 crc kubenswrapper[4757]: I1216 14:16:45.842420 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:46 crc kubenswrapper[4757]: I1216 14:16:46.894320 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bx94j" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="registry-server" probeResult="failure" output=< Dec 16 14:16:46 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s Dec 16 14:16:46 crc kubenswrapper[4757]: > Dec 16 14:16:51 crc kubenswrapper[4757]: I1216 14:16:51.181376 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:16:51 crc kubenswrapper[4757]: I1216 14:16:51.182027 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:16:55 crc kubenswrapper[4757]: I1216 14:16:55.905924 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:55 crc kubenswrapper[4757]: I1216 14:16:55.966360 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:56 crc kubenswrapper[4757]: I1216 14:16:56.143731 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bx94j"] Dec 16 14:16:57 crc kubenswrapper[4757]: I1216 14:16:57.356916 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bx94j" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="registry-server" containerID="cri-o://9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c" gracePeriod=2 Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.187330 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.230309 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-utilities\") pod \"42856886-8337-4999-b58a-f147848e148d\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.230399 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-catalog-content\") pod \"42856886-8337-4999-b58a-f147848e148d\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.230424 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx7ps\" (UniqueName: \"kubernetes.io/projected/42856886-8337-4999-b58a-f147848e148d-kube-api-access-zx7ps\") pod \"42856886-8337-4999-b58a-f147848e148d\" (UID: \"42856886-8337-4999-b58a-f147848e148d\") " Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.231652 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-utilities" (OuterVolumeSpecName: "utilities") pod "42856886-8337-4999-b58a-f147848e148d" (UID: "42856886-8337-4999-b58a-f147848e148d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.238294 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42856886-8337-4999-b58a-f147848e148d-kube-api-access-zx7ps" (OuterVolumeSpecName: "kube-api-access-zx7ps") pod "42856886-8337-4999-b58a-f147848e148d" (UID: "42856886-8337-4999-b58a-f147848e148d"). InnerVolumeSpecName "kube-api-access-zx7ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.332086 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.332127 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx7ps\" (UniqueName: \"kubernetes.io/projected/42856886-8337-4999-b58a-f147848e148d-kube-api-access-zx7ps\") on node \"crc\" DevicePath \"\"" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.340685 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42856886-8337-4999-b58a-f147848e148d" (UID: "42856886-8337-4999-b58a-f147848e148d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.370683 4757 generic.go:334] "Generic (PLEG): container finished" podID="42856886-8337-4999-b58a-f147848e148d" containerID="9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c" exitCode=0 Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.370730 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerDied","Data":"9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c"} Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.370771 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bx94j" event={"ID":"42856886-8337-4999-b58a-f147848e148d","Type":"ContainerDied","Data":"9560c2d66c1e6f03cc1cdee01b9109640c65208ad4cab8c43fe21b0b3471edee"} Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.370788 4757 scope.go:117] "RemoveContainer" containerID="9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.370898 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bx94j" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.401631 4757 scope.go:117] "RemoveContainer" containerID="ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.417142 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bx94j"] Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.428339 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bx94j"] Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.430901 4757 scope.go:117] "RemoveContainer" containerID="8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.433598 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42856886-8337-4999-b58a-f147848e148d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.470485 4757 scope.go:117] "RemoveContainer" containerID="9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c" Dec 16 14:16:58 crc kubenswrapper[4757]: E1216 14:16:58.470869 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c\": container with ID starting with 9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c not found: ID does not exist" containerID="9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.470955 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c"} err="failed to get container status \"9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c\": rpc error: code = NotFound desc = could not find container \"9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c\": container with ID starting with 9b097a88f79ba03199a14412a1dcfa1c0d9be3b9d79f19f9ecef948d75ae903c not found: ID does not exist" Dec 16 14:16:58 crc 
kubenswrapper[4757]: I1216 14:16:58.471061 4757 scope.go:117] "RemoveContainer" containerID="ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9" Dec 16 14:16:58 crc kubenswrapper[4757]: E1216 14:16:58.471365 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9\": container with ID starting with ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9 not found: ID does not exist" containerID="ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.471450 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9"} err="failed to get container status \"ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9\": rpc error: code = NotFound desc = could not find container \"ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9\": container with ID starting with ca9e81408dd7a295084bcd1cc63e82301e95513aacf6343371ac0e67bd2bbfa9 not found: ID does not exist" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.471513 4757 scope.go:117] "RemoveContainer" containerID="8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1" Dec 16 14:16:58 crc kubenswrapper[4757]: E1216 14:16:58.471756 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1\": container with ID starting with 8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1 not found: ID does not exist" containerID="8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.471853 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1"} err="failed to get container status \"8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1\": rpc error: code = NotFound desc = could not find container \"8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1\": container with ID starting with 8af0821f47fc02790b6bd9880875c0d2eaf140fdd94210c4f6290484090c85d1 not found: ID does not exist" Dec 16 14:16:58 crc kubenswrapper[4757]: I1216 14:16:58.959995 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42856886-8337-4999-b58a-f147848e148d" path="/var/lib/kubelet/pods/42856886-8337-4999-b58a-f147848e148d/volumes" Dec 16 14:17:21 crc kubenswrapper[4757]: I1216 14:17:21.180815 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:17:21 crc kubenswrapper[4757]: I1216 14:17:21.181408 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.180865 4757 patch_prober.go:28] interesting 
pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.181584 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.181654 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.182500 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.182588 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" containerID="cri-o://c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098" gracePeriod=600 Dec 16 14:17:51 crc kubenswrapper[4757]: E1216 14:17:51.426476 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43be7319_eac3_4e51_9560_e12d51e97ca6.slice/crio-conmon-c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43be7319_eac3_4e51_9560_e12d51e97ca6.slice/crio-c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098.scope\": RecentStats: unable to find data in memory cache]" Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.903148 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098" exitCode=0 Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.903180 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098"} Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.903654 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"} Dec 16 14:17:51 crc kubenswrapper[4757]: I1216 14:17:51.903709 4757 scope.go:117] "RemoveContainer" containerID="2524dbbddd37d3332355141bca5fa0ee7af9a3ebf16d313cba8d5c669e0c33ba" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.587002 4757 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-l4fk5"] Dec 16 14:18:27 crc kubenswrapper[4757]: E1216 14:18:27.588570 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="extract-content" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.588604 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="extract-content" Dec 16 14:18:27 crc kubenswrapper[4757]: E1216 14:18:27.588632 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="extract-utilities" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.588649 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="extract-utilities" Dec 16 14:18:27 crc kubenswrapper[4757]: E1216 14:18:27.588696 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="registry-server" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.588713 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="registry-server" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.591413 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="42856886-8337-4999-b58a-f147848e148d" containerName="registry-server" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.594131 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.602128 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4fk5"] Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.708597 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-catalog-content\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.708674 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r6x8\" (UniqueName: \"kubernetes.io/projected/f434b49e-1f3a-4330-82be-416a796fcec2-kube-api-access-7r6x8\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.708719 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-utilities\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.810715 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-catalog-content\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.810789 4757 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r6x8\" (UniqueName: \"kubernetes.io/projected/f434b49e-1f3a-4330-82be-416a796fcec2-kube-api-access-7r6x8\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.810830 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-utilities\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.811176 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-catalog-content\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.811265 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-utilities\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.838421 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r6x8\" (UniqueName: \"kubernetes.io/projected/f434b49e-1f3a-4330-82be-416a796fcec2-kube-api-access-7r6x8\") pod \"redhat-marketplace-l4fk5\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:27 crc kubenswrapper[4757]: I1216 14:18:27.937708 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:28 crc kubenswrapper[4757]: I1216 14:18:28.429710 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4fk5"] Dec 16 14:18:29 crc kubenswrapper[4757]: I1216 14:18:29.294383 4757 generic.go:334] "Generic (PLEG): container finished" podID="f434b49e-1f3a-4330-82be-416a796fcec2" containerID="ee4fc7b3919fa49789f29122174bd5c303ddcffdb84a9e4fe006bda57611c5da" exitCode=0 Dec 16 14:18:29 crc kubenswrapper[4757]: I1216 14:18:29.294429 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4fk5" event={"ID":"f434b49e-1f3a-4330-82be-416a796fcec2","Type":"ContainerDied","Data":"ee4fc7b3919fa49789f29122174bd5c303ddcffdb84a9e4fe006bda57611c5da"} Dec 16 14:18:29 crc kubenswrapper[4757]: I1216 14:18:29.294653 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4fk5" event={"ID":"f434b49e-1f3a-4330-82be-416a796fcec2","Type":"ContainerStarted","Data":"428451f0d55f06c665b52d1169d208586d6377d80e27743d289efc441c091078"} Dec 16 14:18:29 crc kubenswrapper[4757]: I1216 14:18:29.982348 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gjczl"] Dec 16 14:18:29 crc kubenswrapper[4757]: I1216 14:18:29.985692 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.002549 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjczl"] Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.052618 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8t4\" (UniqueName: \"kubernetes.io/projected/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-kube-api-access-8d8t4\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.052722 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-utilities\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.052773 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-catalog-content\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.154754 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8t4\" (UniqueName: \"kubernetes.io/projected/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-kube-api-access-8d8t4\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.155451 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-utilities\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.155973 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-catalog-content\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.155914 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-utilities\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.156280 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-catalog-content\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.180168 4757 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8d8t4\" (UniqueName: \"kubernetes.io/projected/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-kube-api-access-8d8t4\") pod \"community-operators-gjczl\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:30 crc kubenswrapper[4757]: I1216 14:18:30.308864 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:31 crc kubenswrapper[4757]: I1216 14:18:31.051128 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjczl"] Dec 16 14:18:31 crc kubenswrapper[4757]: I1216 14:18:31.323844 4757 generic.go:334] "Generic (PLEG): container finished" podID="f434b49e-1f3a-4330-82be-416a796fcec2" containerID="68dc545dd27fe507d8e84f21e6eff909b6b781330316b6eda73adb98038f78e6" exitCode=0 Dec 16 14:18:31 crc kubenswrapper[4757]: I1216 14:18:31.324013 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4fk5" event={"ID":"f434b49e-1f3a-4330-82be-416a796fcec2","Type":"ContainerDied","Data":"68dc545dd27fe507d8e84f21e6eff909b6b781330316b6eda73adb98038f78e6"} Dec 16 14:18:31 crc kubenswrapper[4757]: I1216 14:18:31.325279 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerStarted","Data":"fc89818117c481b24dd14b4b48147c091b57f345acfdbfd021298e6bf2cd3e8f"} Dec 16 14:18:32 crc kubenswrapper[4757]: I1216 14:18:32.333837 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4fk5" event={"ID":"f434b49e-1f3a-4330-82be-416a796fcec2","Type":"ContainerStarted","Data":"3656bce7a31a89a47b8ab10bbc65505dcc7b5a396139276703edea85c257dbac"} Dec 16 14:18:32 crc kubenswrapper[4757]: I1216 14:18:32.335698 4757 generic.go:334] "Generic (PLEG): container finished" podID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerID="6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3" exitCode=0 Dec 16 14:18:32 crc kubenswrapper[4757]: I1216 14:18:32.335738 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerDied","Data":"6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3"} Dec 16 14:18:32 crc kubenswrapper[4757]: I1216 14:18:32.371174 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l4fk5" podStartSLOduration=2.858944914 podStartE2EDuration="5.371155809s" podCreationTimestamp="2025-12-16 14:18:27 +0000 UTC" firstStartedPulling="2025-12-16 14:18:29.297492036 +0000 UTC m=+5494.725235832" lastFinishedPulling="2025-12-16 14:18:31.809702931 +0000 UTC m=+5497.237446727" observedRunningTime="2025-12-16 14:18:32.370648037 +0000 UTC m=+5497.798391853" watchObservedRunningTime="2025-12-16 14:18:32.371155809 +0000 UTC m=+5497.798899605" Dec 16 14:18:34 crc kubenswrapper[4757]: I1216 14:18:34.363860 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerStarted","Data":"6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8"} Dec 16 14:18:36 crc kubenswrapper[4757]: I1216 14:18:36.424645 4757 generic.go:334] "Generic (PLEG): container finished" 
podID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerID="6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8" exitCode=0 Dec 16 14:18:36 crc kubenswrapper[4757]: I1216 14:18:36.424717 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerDied","Data":"6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8"} Dec 16 14:18:37 crc kubenswrapper[4757]: I1216 14:18:37.437597 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerStarted","Data":"8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7"} Dec 16 14:18:37 crc kubenswrapper[4757]: I1216 14:18:37.462403 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gjczl" podStartSLOduration=3.943890397 podStartE2EDuration="8.462386518s" podCreationTimestamp="2025-12-16 14:18:29 +0000 UTC" firstStartedPulling="2025-12-16 14:18:32.336808263 +0000 UTC m=+5497.764552059" lastFinishedPulling="2025-12-16 14:18:36.855304384 +0000 UTC m=+5502.283048180" observedRunningTime="2025-12-16 14:18:37.457896397 +0000 UTC m=+5502.885640193" watchObservedRunningTime="2025-12-16 14:18:37.462386518 +0000 UTC m=+5502.890130314" Dec 16 14:18:37 crc kubenswrapper[4757]: I1216 14:18:37.938413 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:37 crc kubenswrapper[4757]: I1216 14:18:37.938810 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:37 crc kubenswrapper[4757]: I1216 14:18:37.989175 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:38 crc kubenswrapper[4757]: I1216 14:18:38.505274 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:40 crc kubenswrapper[4757]: I1216 14:18:40.309786 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:40 crc kubenswrapper[4757]: I1216 14:18:40.310055 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:40 crc kubenswrapper[4757]: I1216 14:18:40.372308 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.162508 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4fk5"] Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.162749 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l4fk5" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="registry-server" containerID="cri-o://3656bce7a31a89a47b8ab10bbc65505dcc7b5a396139276703edea85c257dbac" gracePeriod=2 Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.489237 4757 generic.go:334] "Generic (PLEG): container finished" podID="f434b49e-1f3a-4330-82be-416a796fcec2" containerID="3656bce7a31a89a47b8ab10bbc65505dcc7b5a396139276703edea85c257dbac" exitCode=0 Dec 16 
14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.489515 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4fk5" event={"ID":"f434b49e-1f3a-4330-82be-416a796fcec2","Type":"ContainerDied","Data":"3656bce7a31a89a47b8ab10bbc65505dcc7b5a396139276703edea85c257dbac"} Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.596971 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.696694 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r6x8\" (UniqueName: \"kubernetes.io/projected/f434b49e-1f3a-4330-82be-416a796fcec2-kube-api-access-7r6x8\") pod \"f434b49e-1f3a-4330-82be-416a796fcec2\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.696743 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-catalog-content\") pod \"f434b49e-1f3a-4330-82be-416a796fcec2\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.696824 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-utilities\") pod \"f434b49e-1f3a-4330-82be-416a796fcec2\" (UID: \"f434b49e-1f3a-4330-82be-416a796fcec2\") " Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.698060 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-utilities" (OuterVolumeSpecName: "utilities") pod "f434b49e-1f3a-4330-82be-416a796fcec2" (UID: "f434b49e-1f3a-4330-82be-416a796fcec2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.702477 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f434b49e-1f3a-4330-82be-416a796fcec2-kube-api-access-7r6x8" (OuterVolumeSpecName: "kube-api-access-7r6x8") pod "f434b49e-1f3a-4330-82be-416a796fcec2" (UID: "f434b49e-1f3a-4330-82be-416a796fcec2"). InnerVolumeSpecName "kube-api-access-7r6x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.719603 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f434b49e-1f3a-4330-82be-416a796fcec2" (UID: "f434b49e-1f3a-4330-82be-416a796fcec2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.798370 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.798400 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r6x8\" (UniqueName: \"kubernetes.io/projected/f434b49e-1f3a-4330-82be-416a796fcec2-kube-api-access-7r6x8\") on node \"crc\" DevicePath \"\"" Dec 16 14:18:41 crc kubenswrapper[4757]: I1216 14:18:41.798410 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f434b49e-1f3a-4330-82be-416a796fcec2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.505202 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4fk5" event={"ID":"f434b49e-1f3a-4330-82be-416a796fcec2","Type":"ContainerDied","Data":"428451f0d55f06c665b52d1169d208586d6377d80e27743d289efc441c091078"} Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.505641 4757 scope.go:117] "RemoveContainer" containerID="3656bce7a31a89a47b8ab10bbc65505dcc7b5a396139276703edea85c257dbac" Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.505516 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4fk5" Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.550297 4757 scope.go:117] "RemoveContainer" containerID="68dc545dd27fe507d8e84f21e6eff909b6b781330316b6eda73adb98038f78e6" Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.576365 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4fk5"] Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.586839 4757 scope.go:117] "RemoveContainer" containerID="ee4fc7b3919fa49789f29122174bd5c303ddcffdb84a9e4fe006bda57611c5da" Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.589335 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4fk5"] Dec 16 14:18:42 crc kubenswrapper[4757]: I1216 14:18:42.972696 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" path="/var/lib/kubelet/pods/f434b49e-1f3a-4330-82be-416a796fcec2/volumes" Dec 16 14:18:50 crc kubenswrapper[4757]: I1216 14:18:50.376366 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:50 crc kubenswrapper[4757]: I1216 14:18:50.439785 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjczl"] Dec 16 14:18:50 crc kubenswrapper[4757]: I1216 14:18:50.586816 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gjczl" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="registry-server" containerID="cri-o://8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7" gracePeriod=2 Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.083777 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.154511 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d8t4\" (UniqueName: \"kubernetes.io/projected/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-kube-api-access-8d8t4\") pod \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.154557 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-utilities\") pod \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.154632 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-catalog-content\") pod \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\" (UID: \"124df3fc-94ac-4c49-8cea-99fb4e91ea8e\") " Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.155378 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-utilities" (OuterVolumeSpecName: "utilities") pod "124df3fc-94ac-4c49-8cea-99fb4e91ea8e" (UID: "124df3fc-94ac-4c49-8cea-99fb4e91ea8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.160582 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-kube-api-access-8d8t4" (OuterVolumeSpecName: "kube-api-access-8d8t4") pod "124df3fc-94ac-4c49-8cea-99fb4e91ea8e" (UID: "124df3fc-94ac-4c49-8cea-99fb4e91ea8e"). InnerVolumeSpecName "kube-api-access-8d8t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.204131 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "124df3fc-94ac-4c49-8cea-99fb4e91ea8e" (UID: "124df3fc-94ac-4c49-8cea-99fb4e91ea8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.257102 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d8t4\" (UniqueName: \"kubernetes.io/projected/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-kube-api-access-8d8t4\") on node \"crc\" DevicePath \"\"" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.257138 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.257148 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124df3fc-94ac-4c49-8cea-99fb4e91ea8e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.597921 4757 generic.go:334] "Generic (PLEG): container finished" podID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerID="8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7" exitCode=0 Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.597959 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerDied","Data":"8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7"} Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.597985 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjczl" event={"ID":"124df3fc-94ac-4c49-8cea-99fb4e91ea8e","Type":"ContainerDied","Data":"fc89818117c481b24dd14b4b48147c091b57f345acfdbfd021298e6bf2cd3e8f"} Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.598016 4757 scope.go:117] "RemoveContainer" containerID="8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.598061 4757 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjczl" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.621820 4757 scope.go:117] "RemoveContainer" containerID="6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.649112 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjczl"] Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.659791 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gjczl"] Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.685385 4757 scope.go:117] "RemoveContainer" containerID="6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.715242 4757 scope.go:117] "RemoveContainer" containerID="8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7" Dec 16 14:18:51 crc kubenswrapper[4757]: E1216 14:18:51.715630 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7\": container with ID starting with 8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7 not found: ID does not exist" containerID="8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.715694 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7"} err="failed to get container status \"8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7\": rpc error: code = NotFound desc = could not find container \"8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7\": container with ID starting with 8a650212ed043e5e96968f2586dec7b7cf1083518e05dc5f38f947a297b5b0e7 not found: ID does not exist" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.715729 4757 scope.go:117] "RemoveContainer" containerID="6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8" Dec 16 14:18:51 crc kubenswrapper[4757]: E1216 14:18:51.716215 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8\": container with ID starting with 6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8 not found: ID does not exist" containerID="6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.716246 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8"} err="failed to get container status \"6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8\": rpc error: code = NotFound desc = could not find container \"6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8\": container with ID starting with 6c6bcac81b181b9fe3b4f79200ba93590b5096ad42b26f3234cbb4c353ba9af8 not found: ID does not exist" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.716269 4757 scope.go:117] "RemoveContainer" containerID="6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3" Dec 16 14:18:51 crc kubenswrapper[4757]: E1216 14:18:51.716559 4757 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3\": container with ID starting with 6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3 not found: ID does not exist" containerID="6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3" Dec 16 14:18:51 crc kubenswrapper[4757]: I1216 14:18:51.716586 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3"} err="failed to get container status \"6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3\": rpc error: code = NotFound desc = could not find container \"6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3\": container with ID starting with 6f4f9fd667d741eb469d5732e9a64b6ea8bb77c4cb6260bd22e3c7749f6b00c3 not found: ID does not exist" Dec 16 14:18:52 crc kubenswrapper[4757]: I1216 14:18:52.965252 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" path="/var/lib/kubelet/pods/124df3fc-94ac-4c49-8cea-99fb4e91ea8e/volumes" Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.455721 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlqnx/must-gather-qxbnt"] Dec 16 14:18:53 crc kubenswrapper[4757]: E1216 14:18:53.456317 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="registry-server" Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456333 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="registry-server" Dec 16 14:18:53 crc kubenswrapper[4757]: E1216 14:18:53.456356 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="extract-content" Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456362 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="extract-content" Dec 16 14:18:53 crc kubenswrapper[4757]: E1216 14:18:53.456378 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="extract-content" Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456383 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="extract-content" Dec 16 14:18:53 crc kubenswrapper[4757]: E1216 14:18:53.456392 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="extract-utilities" Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456398 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="extract-utilities" Dec 16 14:18:53 crc kubenswrapper[4757]: E1216 14:18:53.456413 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="registry-server" Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456418 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="registry-server" Dec 16 14:18:53 crc kubenswrapper[4757]: E1216 14:18:53.456433 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="extract-utilities" 
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456440 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="extract-utilities"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456586 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="124df3fc-94ac-4c49-8cea-99fb4e91ea8e" containerName="registry-server"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.456615 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="f434b49e-1f3a-4330-82be-416a796fcec2" containerName="registry-server"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.457551 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.466909 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wlqnx"/"openshift-service-ca.crt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.469695 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wlqnx/must-gather-qxbnt"]
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.476353 4757 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wlqnx"/"kube-root-ca.crt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.606687 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxss\" (UniqueName: \"kubernetes.io/projected/3803726d-b2f2-4424-9e53-0c9186d1b450-kube-api-access-rbxss\") pod \"must-gather-qxbnt\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.606823 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3803726d-b2f2-4424-9e53-0c9186d1b450-must-gather-output\") pod \"must-gather-qxbnt\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.708970 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxss\" (UniqueName: \"kubernetes.io/projected/3803726d-b2f2-4424-9e53-0c9186d1b450-kube-api-access-rbxss\") pod \"must-gather-qxbnt\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.709225 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3803726d-b2f2-4424-9e53-0c9186d1b450-must-gather-output\") pod \"must-gather-qxbnt\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.709775 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3803726d-b2f2-4424-9e53-0c9186d1b450-must-gather-output\") pod \"must-gather-qxbnt\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.751584 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxss\" (UniqueName: \"kubernetes.io/projected/3803726d-b2f2-4424-9e53-0c9186d1b450-kube-api-access-rbxss\") pod \"must-gather-qxbnt\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:53 crc kubenswrapper[4757]: I1216 14:18:53.775190 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/must-gather-qxbnt"
Dec 16 14:18:54 crc kubenswrapper[4757]: I1216 14:18:54.301958 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wlqnx/must-gather-qxbnt"]
Dec 16 14:18:54 crc kubenswrapper[4757]: I1216 14:18:54.630116 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" event={"ID":"3803726d-b2f2-4424-9e53-0c9186d1b450","Type":"ContainerStarted","Data":"3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf"}
Dec 16 14:18:54 crc kubenswrapper[4757]: I1216 14:18:54.630420 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" event={"ID":"3803726d-b2f2-4424-9e53-0c9186d1b450","Type":"ContainerStarted","Data":"131a945304812ef959e4999a826fad6080b7cd06de756259450918885bb0fd45"}
Dec 16 14:18:55 crc kubenswrapper[4757]: I1216 14:18:55.642154 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" event={"ID":"3803726d-b2f2-4424-9e53-0c9186d1b450","Type":"ContainerStarted","Data":"6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8"}
Dec 16 14:18:55 crc kubenswrapper[4757]: I1216 14:18:55.664296 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" podStartSLOduration=2.664276989 podStartE2EDuration="2.664276989s" podCreationTimestamp="2025-12-16 14:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:18:55.657756449 +0000 UTC m=+5521.085500235" watchObservedRunningTime="2025-12-16 14:18:55.664276989 +0000 UTC m=+5521.092020785"
Dec 16 14:18:58 crc kubenswrapper[4757]: I1216 14:18:58.934710 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-tl7cl"]
Dec 16 14:18:58 crc kubenswrapper[4757]: I1216 14:18:58.936547 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:58 crc kubenswrapper[4757]: I1216 14:18:58.938919 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wlqnx"/"default-dockercfg-ksk4g"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.015402 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-host\") pod \"crc-debug-tl7cl\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") " pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.015752 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp2gj\" (UniqueName: \"kubernetes.io/projected/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-kube-api-access-bp2gj\") pod \"crc-debug-tl7cl\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") " pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.117026 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp2gj\" (UniqueName: \"kubernetes.io/projected/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-kube-api-access-bp2gj\") pod \"crc-debug-tl7cl\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") " pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.117183 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-host\") pod \"crc-debug-tl7cl\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") " pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.117308 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-host\") pod \"crc-debug-tl7cl\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") " pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.139149 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp2gj\" (UniqueName: \"kubernetes.io/projected/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-kube-api-access-bp2gj\") pod \"crc-debug-tl7cl\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") " pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.262493 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.689215 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl" event={"ID":"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe","Type":"ContainerStarted","Data":"8f6cca2bcb9278eb9a3b2d2d1313a6c48b2f4b9c17f76eccc1cfd69256c17eb7"}
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.689462 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl" event={"ID":"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe","Type":"ContainerStarted","Data":"93576bfda35ad2f99603861ef76b24ec4ef40ce943954e516f48c7c544b29671"}
Dec 16 14:18:59 crc kubenswrapper[4757]: I1216 14:18:59.718323 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl" podStartSLOduration=1.718306305 podStartE2EDuration="1.718306305s" podCreationTimestamp="2025-12-16 14:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:18:59.716193423 +0000 UTC m=+5525.143937249" watchObservedRunningTime="2025-12-16 14:18:59.718306305 +0000 UTC m=+5525.146050101"
Dec 16 14:19:42 crc kubenswrapper[4757]: I1216 14:19:42.099428 4757 generic.go:334] "Generic (PLEG): container finished" podID="d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" containerID="8f6cca2bcb9278eb9a3b2d2d1313a6c48b2f4b9c17f76eccc1cfd69256c17eb7" exitCode=0
Dec 16 14:19:42 crc kubenswrapper[4757]: I1216 14:19:42.099517 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl" event={"ID":"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe","Type":"ContainerDied","Data":"8f6cca2bcb9278eb9a3b2d2d1313a6c48b2f4b9c17f76eccc1cfd69256c17eb7"}
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.197969 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.218689 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp2gj\" (UniqueName: \"kubernetes.io/projected/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-kube-api-access-bp2gj\") pod \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") "
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.218922 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-host\") pod \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\" (UID: \"d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe\") "
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.219421 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-host" (OuterVolumeSpecName: "host") pod "d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" (UID: "d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.238790 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-kube-api-access-bp2gj" (OuterVolumeSpecName: "kube-api-access-bp2gj") pod "d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" (UID: "d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe"). InnerVolumeSpecName "kube-api-access-bp2gj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.249059 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-tl7cl"]
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.275977 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-tl7cl"]
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.321074 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp2gj\" (UniqueName: \"kubernetes.io/projected/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-kube-api-access-bp2gj\") on node \"crc\" DevicePath \"\""
Dec 16 14:19:43 crc kubenswrapper[4757]: I1216 14:19:43.321340 4757 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe-host\") on node \"crc\" DevicePath \"\""
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.118291 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93576bfda35ad2f99603861ef76b24ec4ef40ce943954e516f48c7c544b29671"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.118349 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-tl7cl"
Dec 16 14:19:44 crc kubenswrapper[4757]: E1216 14:19:44.275410 4757 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9228eb3_cc03_41f7_9cf3_1aeb272c1ebe.slice/crio-93576bfda35ad2f99603861ef76b24ec4ef40ce943954e516f48c7c544b29671\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9228eb3_cc03_41f7_9cf3_1aeb272c1ebe.slice\": RecentStats: unable to find data in memory cache]"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.506102 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-lf8fp"]
Dec 16 14:19:44 crc kubenswrapper[4757]: E1216 14:19:44.506895 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" containerName="container-00"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.507000 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" containerName="container-00"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.507362 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" containerName="container-00"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.508252 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.510719 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wlqnx"/"default-dockercfg-ksk4g"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.544652 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7rh\" (UniqueName: \"kubernetes.io/projected/7758a50c-8614-4262-bcf5-fabc0008d3e5-kube-api-access-gk7rh\") pod \"crc-debug-lf8fp\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") " pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.544714 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7758a50c-8614-4262-bcf5-fabc0008d3e5-host\") pod \"crc-debug-lf8fp\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") " pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.646649 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7758a50c-8614-4262-bcf5-fabc0008d3e5-host\") pod \"crc-debug-lf8fp\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") " pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.646854 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7rh\" (UniqueName: \"kubernetes.io/projected/7758a50c-8614-4262-bcf5-fabc0008d3e5-kube-api-access-gk7rh\") pod \"crc-debug-lf8fp\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") " pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.647287 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7758a50c-8614-4262-bcf5-fabc0008d3e5-host\") pod \"crc-debug-lf8fp\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") " pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.665825 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk7rh\" (UniqueName: \"kubernetes.io/projected/7758a50c-8614-4262-bcf5-fabc0008d3e5-kube-api-access-gk7rh\") pod \"crc-debug-lf8fp\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") " pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.824493 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:44 crc kubenswrapper[4757]: I1216 14:19:44.976100 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe" path="/var/lib/kubelet/pods/d9228eb3-cc03-41f7-9cf3-1aeb272c1ebe/volumes"
Dec 16 14:19:45 crc kubenswrapper[4757]: I1216 14:19:45.126693 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp" event={"ID":"7758a50c-8614-4262-bcf5-fabc0008d3e5","Type":"ContainerStarted","Data":"c0e9f172165a8649673865958ba5c0382f6e8eb4cb73e56a83d183a8bb6c5964"}
Dec 16 14:19:45 crc kubenswrapper[4757]: I1216 14:19:45.127057 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp" event={"ID":"7758a50c-8614-4262-bcf5-fabc0008d3e5","Type":"ContainerStarted","Data":"e97bd225f90ea9f7e0c2c2abf957a76a74a6d533d5e57fefa216ad231ebe3299"}
Dec 16 14:19:45 crc kubenswrapper[4757]: I1216 14:19:45.140315 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp" podStartSLOduration=1.140298619 podStartE2EDuration="1.140298619s" podCreationTimestamp="2025-12-16 14:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:19:45.137159792 +0000 UTC m=+5570.564903588" watchObservedRunningTime="2025-12-16 14:19:45.140298619 +0000 UTC m=+5570.568042415"
Dec 16 14:19:46 crc kubenswrapper[4757]: I1216 14:19:46.140223 4757 generic.go:334] "Generic (PLEG): container finished" podID="7758a50c-8614-4262-bcf5-fabc0008d3e5" containerID="c0e9f172165a8649673865958ba5c0382f6e8eb4cb73e56a83d183a8bb6c5964" exitCode=0
Dec 16 14:19:46 crc kubenswrapper[4757]: I1216 14:19:46.140265 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp" event={"ID":"7758a50c-8614-4262-bcf5-fabc0008d3e5","Type":"ContainerDied","Data":"c0e9f172165a8649673865958ba5c0382f6e8eb4cb73e56a83d183a8bb6c5964"}
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.246854 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.282756 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-lf8fp"]
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.290321 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-lf8fp"]
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.410409 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7rh\" (UniqueName: \"kubernetes.io/projected/7758a50c-8614-4262-bcf5-fabc0008d3e5-kube-api-access-gk7rh\") pod \"7758a50c-8614-4262-bcf5-fabc0008d3e5\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") "
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.410476 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7758a50c-8614-4262-bcf5-fabc0008d3e5-host\") pod \"7758a50c-8614-4262-bcf5-fabc0008d3e5\" (UID: \"7758a50c-8614-4262-bcf5-fabc0008d3e5\") "
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.410589 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7758a50c-8614-4262-bcf5-fabc0008d3e5-host" (OuterVolumeSpecName: "host") pod "7758a50c-8614-4262-bcf5-fabc0008d3e5" (UID: "7758a50c-8614-4262-bcf5-fabc0008d3e5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.410846 4757 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7758a50c-8614-4262-bcf5-fabc0008d3e5-host\") on node \"crc\" DevicePath \"\""
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.422323 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7758a50c-8614-4262-bcf5-fabc0008d3e5-kube-api-access-gk7rh" (OuterVolumeSpecName: "kube-api-access-gk7rh") pod "7758a50c-8614-4262-bcf5-fabc0008d3e5" (UID: "7758a50c-8614-4262-bcf5-fabc0008d3e5"). InnerVolumeSpecName "kube-api-access-gk7rh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:19:47 crc kubenswrapper[4757]: I1216 14:19:47.512128 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7rh\" (UniqueName: \"kubernetes.io/projected/7758a50c-8614-4262-bcf5-fabc0008d3e5-kube-api-access-gk7rh\") on node \"crc\" DevicePath \"\""
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.156434 4757 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e97bd225f90ea9f7e0c2c2abf957a76a74a6d533d5e57fefa216ad231ebe3299"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.156524 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-lf8fp"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.507964 4757 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-mgbxs"]
Dec 16 14:19:48 crc kubenswrapper[4757]: E1216 14:19:48.508446 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7758a50c-8614-4262-bcf5-fabc0008d3e5" containerName="container-00"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.508462 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="7758a50c-8614-4262-bcf5-fabc0008d3e5" containerName="container-00"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.508695 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="7758a50c-8614-4262-bcf5-fabc0008d3e5" containerName="container-00"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.513667 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.515909 4757 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wlqnx"/"default-dockercfg-ksk4g"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.556647 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4497265-5045-4886-93c8-55b10c379d7f-host\") pod \"crc-debug-mgbxs\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") " pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.556735 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2rmb\" (UniqueName: \"kubernetes.io/projected/b4497265-5045-4886-93c8-55b10c379d7f-kube-api-access-q2rmb\") pod \"crc-debug-mgbxs\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") " pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.658609 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2rmb\" (UniqueName: \"kubernetes.io/projected/b4497265-5045-4886-93c8-55b10c379d7f-kube-api-access-q2rmb\") pod \"crc-debug-mgbxs\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") " pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.658805 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4497265-5045-4886-93c8-55b10c379d7f-host\") pod \"crc-debug-mgbxs\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") " pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.658970 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4497265-5045-4886-93c8-55b10c379d7f-host\") pod \"crc-debug-mgbxs\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") " pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.676504 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2rmb\" (UniqueName: \"kubernetes.io/projected/b4497265-5045-4886-93c8-55b10c379d7f-kube-api-access-q2rmb\") pod \"crc-debug-mgbxs\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") " pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.828835 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:48 crc kubenswrapper[4757]: I1216 14:19:48.960536 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7758a50c-8614-4262-bcf5-fabc0008d3e5" path="/var/lib/kubelet/pods/7758a50c-8614-4262-bcf5-fabc0008d3e5/volumes"
Dec 16 14:19:49 crc kubenswrapper[4757]: I1216 14:19:49.166404 4757 generic.go:334] "Generic (PLEG): container finished" podID="b4497265-5045-4886-93c8-55b10c379d7f" containerID="4116e09fae08dbbbf8942c59a07b642fefc33529c23ee2665dc72bde9402f74e" exitCode=0
Dec 16 14:19:49 crc kubenswrapper[4757]: I1216 14:19:49.166445 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-mgbxs" event={"ID":"b4497265-5045-4886-93c8-55b10c379d7f","Type":"ContainerDied","Data":"4116e09fae08dbbbf8942c59a07b642fefc33529c23ee2665dc72bde9402f74e"}
Dec 16 14:19:49 crc kubenswrapper[4757]: I1216 14:19:49.166474 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/crc-debug-mgbxs" event={"ID":"b4497265-5045-4886-93c8-55b10c379d7f","Type":"ContainerStarted","Data":"6f5928a9f88df540a3ed8bebf4a0aa6cd1cfba3f2817c6c3293118e16210112d"}
Dec 16 14:19:49 crc kubenswrapper[4757]: I1216 14:19:49.204181 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-mgbxs"]
Dec 16 14:19:49 crc kubenswrapper[4757]: I1216 14:19:49.220633 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlqnx/crc-debug-mgbxs"]
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.278028 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.401108 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2rmb\" (UniqueName: \"kubernetes.io/projected/b4497265-5045-4886-93c8-55b10c379d7f-kube-api-access-q2rmb\") pod \"b4497265-5045-4886-93c8-55b10c379d7f\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") "
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.401297 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4497265-5045-4886-93c8-55b10c379d7f-host\") pod \"b4497265-5045-4886-93c8-55b10c379d7f\" (UID: \"b4497265-5045-4886-93c8-55b10c379d7f\") "
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.401612 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4497265-5045-4886-93c8-55b10c379d7f-host" (OuterVolumeSpecName: "host") pod "b4497265-5045-4886-93c8-55b10c379d7f" (UID: "b4497265-5045-4886-93c8-55b10c379d7f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.401905 4757 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4497265-5045-4886-93c8-55b10c379d7f-host\") on node \"crc\" DevicePath \"\""
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.410466 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4497265-5045-4886-93c8-55b10c379d7f-kube-api-access-q2rmb" (OuterVolumeSpecName: "kube-api-access-q2rmb") pod "b4497265-5045-4886-93c8-55b10c379d7f" (UID: "b4497265-5045-4886-93c8-55b10c379d7f"). InnerVolumeSpecName "kube-api-access-q2rmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.503285 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2rmb\" (UniqueName: \"kubernetes.io/projected/b4497265-5045-4886-93c8-55b10c379d7f-kube-api-access-q2rmb\") on node \"crc\" DevicePath \"\""
Dec 16 14:19:50 crc kubenswrapper[4757]: I1216 14:19:50.963148 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4497265-5045-4886-93c8-55b10c379d7f" path="/var/lib/kubelet/pods/b4497265-5045-4886-93c8-55b10c379d7f/volumes"
Dec 16 14:19:51 crc kubenswrapper[4757]: I1216 14:19:51.182435 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 14:19:51 crc kubenswrapper[4757]: I1216 14:19:51.182495 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 14:19:51 crc kubenswrapper[4757]: I1216 14:19:51.185131 4757 scope.go:117] "RemoveContainer" containerID="4116e09fae08dbbbf8942c59a07b642fefc33529c23ee2665dc72bde9402f74e"
Dec 16 14:19:51 crc kubenswrapper[4757]: I1216 14:19:51.185221 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/crc-debug-mgbxs"
Dec 16 14:20:15 crc kubenswrapper[4757]: I1216 14:20:15.976340 4757 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-544dfc5bc-8q666" podUID="38a8b3dc-7995-4851-96db-0fb6749669b9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Dec 16 14:20:21 crc kubenswrapper[4757]: I1216 14:20:21.180993 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 16 14:20:21 crc kubenswrapper[4757]: I1216 14:20:21.181467 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 16 14:20:26 crc kubenswrapper[4757]: I1216 14:20:26.762072 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cfff6bfd-qz5sk_fd29da8f-05a6-43a9-a943-c6a8a4ef8479/barbican-api/0.log"
Dec 16 14:20:26 crc kubenswrapper[4757]: I1216 14:20:26.806455 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cfff6bfd-qz5sk_fd29da8f-05a6-43a9-a943-c6a8a4ef8479/barbican-api-log/0.log"
Dec 16 14:20:26 crc kubenswrapper[4757]: I1216 14:20:26.936299 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d9fbfd57d-79dn2_25963bc5-afd1-4703-a583-df0d8094117d/barbican-keystone-listener/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.064120 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d9fbfd57d-79dn2_25963bc5-afd1-4703-a583-df0d8094117d/barbican-keystone-listener-log/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.180975 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-9bd99879-lw2rw_7d1df7bf-6c39-4e49-873a-701b8c05f900/barbican-worker/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.299476 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-9bd99879-lw2rw_7d1df7bf-6c39-4e49-873a-701b8c05f900/barbican-worker-log/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.383258 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rhszg_94b4d3d7-3488-45fa-bbeb-894a4bb55ca1/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.553509 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-central-agent/1.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.644168 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-central-agent/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.713059 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-notification-agent/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.719207 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/ceilometer-notification-agent/1.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.806133 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/proxy-httpd/0.log"
Dec 16 14:20:27 crc kubenswrapper[4757]: I1216 14:20:27.912937 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65a6f73f-1823-407a-a9f2-c693e5ddcca9/sg-core/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.118048 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e2b303e0-e076-4589-9fb3-b51f998a293e/cinder-api-log/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.134703 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e2b303e0-e076-4589-9fb3-b51f998a293e/cinder-api/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.271841 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9065cfba-e560-471d-bb64-e20502e5b5d6/cinder-scheduler/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.446891 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9065cfba-e560-471d-bb64-e20502e5b5d6/probe/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.529810 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kth25_0b04c40b-fcee-4a0c-b5d0-c994f3fd138e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.720618 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fc752_ec2b71fe-44a0-4fae-b631-f719f7d735a5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:28 crc kubenswrapper[4757]: I1216 14:20:28.757075 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d47554775-kw5w5_dae9e574-826f-4521-8b35-5c836c1cde3b/init/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.159779 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d47554775-kw5w5_dae9e574-826f-4521-8b35-5c836c1cde3b/init/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.306347 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mzqjm_74f0d526-ef23-47fd-b475-6f799fd57ba5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.469314 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6d47554775-kw5w5_dae9e574-826f-4521-8b35-5c836c1cde3b/dnsmasq-dns/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.662742 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_40de399c-634b-4d44-a9ca-0aec62a9088b/glance-httpd/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.705289 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_40de399c-634b-4d44-a9ca-0aec62a9088b/glance-log/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.844862 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e5b2048-f283-4bad-a57a-ae09865c33f2/glance-httpd/0.log"
Dec 16 14:20:29 crc kubenswrapper[4757]: I1216 14:20:29.987402 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4e5b2048-f283-4bad-a57a-ae09865c33f2/glance-log/0.log"
Dec 16 14:20:30 crc kubenswrapper[4757]: I1216 14:20:30.218874 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d66ddf65b-lmltr_65337bd1-c674-4817-91c2-ad150639205c/horizon/1.log"
Dec 16 14:20:30 crc kubenswrapper[4757]: I1216 14:20:30.285237 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d66ddf65b-lmltr_65337bd1-c674-4817-91c2-ad150639205c/horizon/2.log"
Dec 16 14:20:30 crc kubenswrapper[4757]: I1216 14:20:30.591421 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r9wzt_0b943ec1-dc21-47ab-832a-d6f68f3ac17f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:30 crc kubenswrapper[4757]: I1216 14:20:30.788501 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-6fz7k_cd87efc3-653f-4794-89b8-490ea0b504dd/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:30 crc kubenswrapper[4757]: I1216 14:20:30.823253 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d66ddf65b-lmltr_65337bd1-c674-4817-91c2-ad150639205c/horizon-log/0.log"
Dec 16 14:20:31 crc kubenswrapper[4757]: I1216 14:20:31.182640 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29431561-kfrzf_82cd88c0-672b-4d50-ae86-edeae2da08a1/keystone-cron/0.log"
Dec 16 14:20:31 crc kubenswrapper[4757]: I1216 14:20:31.387607 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-66866d5f44-2mhtb_eb176388-d71c-4d06-986d-f62cb0d86fe3/keystone-api/0.log"
Dec 16 14:20:31 crc kubenswrapper[4757]: I1216 14:20:31.497858 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2457fa41-c003-450d-a55e-f67c36155f94/kube-state-metrics/0.log"
Dec 16 14:20:31 crc kubenswrapper[4757]: I1216 14:20:31.760636 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-pq8xp_d146c06e-d73a-47a2-8e1f-07ca485b1a72/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:32 crc kubenswrapper[4757]: I1216 14:20:32.271692 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-khjhl_ad61ff87-21a4-4583-83b4-65c2253f2993/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:32 crc kubenswrapper[4757]: I1216 14:20:32.492017 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f64c6bbf7-pnthz_cac5be05-fb05-4246-86e5-2b8dbdbffd04/neutron-httpd/0.log"
Dec 16 14:20:32 crc kubenswrapper[4757]: I1216 14:20:32.589782 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7f64c6bbf7-pnthz_cac5be05-fb05-4246-86e5-2b8dbdbffd04/neutron-api/0.log"
Dec 16 14:20:33 crc kubenswrapper[4757]: I1216 14:20:33.535201 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_614c552b-9e07-4f84-becd-3dfa75851309/nova-cell0-conductor-conductor/0.log"
Dec 16 14:20:34 crc kubenswrapper[4757]: I1216 14:20:34.050853 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f7f0c537-530b-4e4a-ae96-35ba695d26be/nova-cell1-conductor-conductor/0.log"
Dec 16 14:20:34 crc kubenswrapper[4757]: I1216 14:20:34.310451 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9b61a66c-fcc6-4cc3-acf4-1fda2f1506cd/nova-cell1-novncproxy-novncproxy/0.log"
Dec 16 14:20:34 crc kubenswrapper[4757]: I1216 14:20:34.390674 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6aaa83ef-e285-41a7-93c0-853ecd275115/nova-api-log/0.log"
Dec 16 14:20:34 crc kubenswrapper[4757]: I1216 14:20:34.660244 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6aaa83ef-e285-41a7-93c0-853ecd275115/nova-api-api/0.log"
Dec 16 14:20:34 crc kubenswrapper[4757]: I1216 14:20:34.769461 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877/nova-metadata-log/0.log"
Dec 16 14:20:34 crc kubenswrapper[4757]: I1216 14:20:34.772271 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-8dcjd_13b0d4c7-5eab-400a-9513-9391342fffee/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:35 crc kubenswrapper[4757]: I1216 14:20:35.203619 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_baba14f2-35db-422f-a583-724854b001d1/mysql-bootstrap/0.log"
Dec 16 14:20:35 crc kubenswrapper[4757]: I1216 14:20:35.417550 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_baba14f2-35db-422f-a583-724854b001d1/mysql-bootstrap/0.log"
Dec 16 14:20:35 crc kubenswrapper[4757]: I1216 14:20:35.461936 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_baba14f2-35db-422f-a583-724854b001d1/galera/0.log"
Dec 16 14:20:35 crc kubenswrapper[4757]: I1216 14:20:35.643851 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fc85d441-d05f-4495-a380-1a5ed58ad631/nova-scheduler-scheduler/0.log"
Dec 16 14:20:35 crc kubenswrapper[4757]: I1216 14:20:35.802132 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2d16196-ea98-44e5-b859-bea9a8392c01/mysql-bootstrap/0.log"
Dec 16 14:20:36 crc kubenswrapper[4757]: I1216 14:20:36.006591 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2d16196-ea98-44e5-b859-bea9a8392c01/mysql-bootstrap/0.log"
Dec 16 14:20:36 crc kubenswrapper[4757]: I1216 14:20:36.116238 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2d16196-ea98-44e5-b859-bea9a8392c01/galera/0.log"
Dec 16 14:20:36 crc kubenswrapper[4757]: I1216 14:20:36.270998 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_891886a7-6bbd-48b7-8460-a1467bae862a/openstackclient/0.log"
Dec 16 14:20:36 crc kubenswrapper[4757]: I1216 14:20:36.430021 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d654b_e270a555-95f7-466f-8bb6-e76836a33d68/openstack-network-exporter/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.012084 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovsdb-server-init/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.128616 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0f3c1eb6-5bc6-4b1e-a9e2-50972fe20877/nova-metadata-metadata/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.313306 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovsdb-server-init/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.357762 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovs-vswitchd/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.403649 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2mmpx_86aeade4-6bed-4d48-ab21-c43ac5b8c06b/ovsdb-server/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.601502 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xjblp_824c8db6-764f-4062-85c5-3c0fcbe434ce/ovn-controller/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.735464 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jj4t6_85e4dfc5-8085-4270-847a-a36c8194b383/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:37 crc kubenswrapper[4757]: I1216 14:20:37.947375 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_16cd2aac-d1cc-4f30-8b86-8fd811f20f88/openstack-network-exporter/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.020977 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f3e06047-7a9b-46ce-9021-a88b62993e3d/memcached/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.122938 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_16cd2aac-d1cc-4f30-8b86-8fd811f20f88/ovn-northd/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.158985 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_89cc68a0-15fd-4a20-bd71-9c8acb5a92c7/openstack-network-exporter/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.212792 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_89cc68a0-15fd-4a20-bd71-9c8acb5a92c7/ovsdbserver-nb/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.402767 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_972a26d6-4f3b-4fc4-8e86-055dfe33652a/openstack-network-exporter/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.452118 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_972a26d6-4f3b-4fc4-8e86-055dfe33652a/ovsdbserver-sb/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.569704 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5494d9c5f6-8dwpv_e8ba167f-6c35-410d-b690-1083c5a482ae/placement-api/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.787704 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_935a64f5-e332-4c06-b4df-f93ec46b7b35/setup-container/0.log"
Dec 16 14:20:38 crc kubenswrapper[4757]: I1216 14:20:38.812236 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5494d9c5f6-8dwpv_e8ba167f-6c35-410d-b690-1083c5a482ae/placement-log/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.005509 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_935a64f5-e332-4c06-b4df-f93ec46b7b35/rabbitmq/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.018237 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_935a64f5-e332-4c06-b4df-f93ec46b7b35/setup-container/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.071607 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_268a1573-c10e-42ca-9776-222ed2186693/setup-container/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.291568 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_268a1573-c10e-42ca-9776-222ed2186693/setup-container/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.295980 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_268a1573-c10e-42ca-9776-222ed2186693/rabbitmq/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.320702 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-ncpk9_f4c2d838-cc46-4457-9b88-5ea6eb7f14e4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.514032 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-spb5p_95440627-f74c-45d0-a168-e8c37e8e7122/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.570670 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-77555_2c6850c9-0076-4df2-92e7-14521aa14305/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.613038 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kdgqh_af434566-0202-4f31-a55c-440b7ae410e6/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 16 14:20:39 crc kubenswrapper[4757]: I1216 14:20:39.831571 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xnv2c_228eaf09-5e24-45ad-b3f8-d0ac2d5c8f5c/ssh-known-hosts-edpm-deployment/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.011518 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544dfc5bc-8q666_38a8b3dc-7995-4851-96db-0fb6749669b9/proxy-server/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.041571 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-544dfc5bc-8q666_38a8b3dc-7995-4851-96db-0fb6749669b9/proxy-httpd/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.161933 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j69vw_ff595563-ea6e-4337-8018-275c60afebfb/swift-ring-rebalance/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.249091 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-auditor/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.624082 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-reaper/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.782518 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-server/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.793855 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/account-replicator/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.795794 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-auditor/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.881183 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-replicator/0.log"
Dec 16 14:20:40 crc kubenswrapper[4757]: I1216 14:20:40.918683 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-server/0.log"
Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.069729 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-expirer/0.log"
Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.075045 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/container-updater/0.log"
Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.131617 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-auditor/0.log"
Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.258210 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-server/0.log"
Dec 16 14:20:41
crc kubenswrapper[4757]: I1216 14:20:41.280757 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/rsync/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.315951 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-replicator/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.347618 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/object-updater/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.383624 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_510c5136-4ca0-49c9-ba30-1cafb624d71f/swift-recon-cron/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.703788 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2q9tn_eb89db22-f667-4563-9468-97cd48c1da89/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.730680 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d2802d44-5cd2-4f45-80b0-d423d3ab6ea8/tempest-tests-tempest-tests-runner/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.897212 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_20af979c-4f0b-44d5-946c-fa6138ee9539/test-operator-logs-container/0.log" Dec 16 14:20:41 crc kubenswrapper[4757]: I1216 14:20:41.985818 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nht9n_948d5531-d301-46c5-ac1a-882ceee8df96/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.181465 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.181905 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.181945 4757 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.182611 4757 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"} pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.182656 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" 
containerName="machine-config-daemon" containerID="cri-o://16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" gracePeriod=600 Dec 16 14:20:51 crc kubenswrapper[4757]: E1216 14:20:51.304092 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.759933 4757 generic.go:334] "Generic (PLEG): container finished" podID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" exitCode=0 Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.760028 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerDied","Data":"16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"} Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.760416 4757 scope.go:117] "RemoveContainer" containerID="c86c64d9b5110260e0da79d803ebceec5b850120d3c283cb47799b3ce87c4098" Dec 16 14:20:51 crc kubenswrapper[4757]: I1216 14:20:51.761057 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:20:51 crc kubenswrapper[4757]: E1216 14:20:51.761455 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:21:03 crc kubenswrapper[4757]: I1216 14:21:03.948980 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:21:03 crc kubenswrapper[4757]: E1216 14:21:03.949845 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:21:09 crc kubenswrapper[4757]: I1216 14:21:09.922690 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/util/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.123415 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/util/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.139648 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/pull/0.log"
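
The Liveness probe failure recorded above ("Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused") is an ordinary HTTP health check: the kubelet issues a GET against the container and treats a 2xx/3xx response within the probe timeout as a pass. A minimal Go sketch of that contract, reusing the URL from the log entry; an illustration of the semantics, not the kubelet's actual prober:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs one HTTP liveness-style check: any 2xx/3xx response
// within the timeout passes; a transport error (such as the "connection
// refused" above) or any other status fails.
func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health", time.Second); err != nil {
		// On repeated failures the kubelet kills the container, as in the
		// "Killing container with a grace period" entry above.
		fmt.Println("Liveness probe failed:", err)
	}
}
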
Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.144655 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/pull/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.419749 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/pull/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.471117 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/extract/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.476274 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_694c9ad89f42de7c4a1f20ba182dee372cbaff9f08ac70576cb15d07b4brcwp_62758af3-0127-42e3-a06f-0ba9ff4452c4/util/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.702495 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-sgvcj_34c17eba-d6e6-4399-a0a0-f25ef7a89fb9/manager/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.708094 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-4z9jz_3717fd56-4339-4ad6-940d-b5023c76d32f/manager/1.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.809763 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-4z9jz_3717fd56-4339-4ad6-940d-b5023c76d32f/manager/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.987058 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-t9vm8_a6449c1f-3695-445d-90b0-64b4c79cde05/manager/0.log" Dec 16 14:21:10 crc kubenswrapper[4757]: I1216 14:21:10.994252 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-t9vm8_a6449c1f-3695-445d-90b0-64b4c79cde05/manager/1.log" Dec 16 14:21:11 crc kubenswrapper[4757]: I1216 14:21:11.226683 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-qtkjq_fbd5f746-9483-455c-988e-2e882623d09e/manager/0.log" Dec 16 14:21:11 crc kubenswrapper[4757]: I1216 14:21:11.240227 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-lvxk6_b46e5138-a221-489d-9d7a-a54cf3938d64/manager/0.log" Dec 16 14:21:11 crc kubenswrapper[4757]: I1216 14:21:11.447251 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-w9gps_6c815add-abbd-4655-b257-d50ab074414a/manager/0.log" Dec 16 14:21:11 crc kubenswrapper[4757]: I1216 14:21:11.775390 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-45bgz_6333c537-0505-48c0-b197-a609084a2a2c/manager/1.log" Dec 16 14:21:11 crc kubenswrapper[4757]: I1216 14:21:11.777976 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-45bgz_6333c537-0505-48c0-b197-a609084a2a2c/manager/0.log"
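
The "RemoveContainer" / "Error syncing pod, skipping ... CrashLoopBackOff" pairs that repeat above and below show the kubelet declining to restart the crashed machine-config-daemon container until its back-off timer expires. The quoted "back-off 5m0s" is the cap on that timer; upstream kubelet behavior (assumed here) starts the delay at 10s and doubles it on each failed restart. A sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

// restartDelay doubles the back-off per failed restart and caps it,
// mirroring the assumed 10s-doubling-to-5m kubelet policy.
func restartDelay(restarts int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, restartDelay(r, 10*time.Second, 5*time.Minute))
	}
	// Once the delay is pinned at 5m0s, each sync attempt inside the window
	// just logs the CrashLoopBackOff error seen throughout this journal.
}
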
Dec 16 14:21:11 crc kubenswrapper[4757]: I1216 14:21:11.811215 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-fw274_904525e7-6f82-4fbf-928a-99194a97829a/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.069431 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-wg8wq_eb0b72d5-d126-4513-8fbb-c7eed0a8f5e0/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.071347 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-wgrph_72a6aea3-2309-4c98-802b-416feed1ba0f/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.288122 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-psxvw_75d829d5-a3cd-48c6-8aff-07f7d325b4f9/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.376850 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-v7jvg_545806dc-d916-4704-bc27-f5a46915fb56/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.559629 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-bqfgm_83154b06-c2df-4a44-9a33-4971cd60add3/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.589333 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-dxgr9_e9f15431-d8cd-408d-8169-e06457cabccc/manager/1.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.624901 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-dxgr9_e9f15431-d8cd-408d-8169-e06457cabccc/manager/0.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.803735 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9_67c1a6c7-35d1-48a9-a058-13e5d5599fe7/manager/1.log" Dec 16 14:21:12 crc kubenswrapper[4757]: I1216 14:21:12.883197 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c59p5q9_67c1a6c7-35d1-48a9-a058-13e5d5599fe7/manager/0.log" Dec 16 14:21:13 crc kubenswrapper[4757]: I1216 14:21:13.351440 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cn9nk_b56f5192-de72-41a1-b733-edd456541eda/registry-server/0.log" Dec 16 14:21:13 crc kubenswrapper[4757]: I1216 14:21:13.545787 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-56fbb56c9b-wtj5t_4e6ed2fb-5d23-44a6-9967-6cc3ed88a1b0/operator/0.log" Dec 16 14:21:13 crc kubenswrapper[4757]: I1216 14:21:13.654928 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-x72bf_60821702-232d-4eb4-b70f-15e87e070aed/manager/0.log" Dec 16 14:21:13 crc kubenswrapper[4757]: I1216 14:21:13.904570 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-bk25g_28ec7b61-2e0c-4ad7-8569-eeb5973b976d/manager/0.log"
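
The long runs of "Finished parsing log file" entries are the kubelet reading container logs under /var/log/pods/<namespace>_<pod>_<uid>/<container>/<restart-count>.log, as in every path= field above. Those files use the CRI logging format: one record per line, "RFC3339Nano-timestamp stream tag message", where the tag is F for a full line and P for a partial one. A small standalone reader for that format (a sketch, not the kubelet's own implementation):

package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
	"time"
)

// entry is one decoded CRI log record.
type entry struct {
	when    time.Time
	stream  string // "stdout" or "stderr"
	partial bool   // tag "P": the message continues in the next record
	msg     string
}

func parseLine(line string) (entry, error) {
	parts := strings.SplitN(line, " ", 4)
	if len(parts) != 4 {
		return entry{}, fmt.Errorf("malformed CRI log line: %q", line)
	}
	ts, err := time.Parse(time.RFC3339Nano, parts[0])
	if err != nil {
		return entry{}, err
	}
	return entry{when: ts, stream: parts[1], partial: parts[2] == "P", msg: parts[3]}, nil
}

func main() {
	// e.g. /var/log/pods/openstack_memcached-0_f3e06047-7a9b-46ce-9021-a88b62993e3d/memcached/0.log
	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		if e, err := parseLine(sc.Text()); err == nil {
			fmt.Printf("%s [%s] %s\n", e.when.Format(time.RFC3339), e.stream, e.msg)
		}
	}
}
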
Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.090793 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gcxgg_7432087f-983f-4b3d-af98-40238ceba951/operator/0.log" Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.244944 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-wgxxx_120aab20-c2fb-441d-9c07-bd05c0678a11/manager/0.log" Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.254109 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-554cfb9dfb-d6w2k_7b6693c4-d7ad-4edc-ba55-baa2fea5094a/manager/0.log" Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.429536 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-85zkh_ac2d53dd-c297-44b1-bcb1-a3025530eb5c/manager/0.log" Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.513024 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-w6kx8_42d952f0-a650-484d-9e6b-b1c6c0f252dc/manager/0.log" Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.554048 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-w6kx8_42d952f0-a650-484d-9e6b-b1c6c0f252dc/manager/1.log" Dec 16 14:21:14 crc kubenswrapper[4757]: I1216 14:21:14.652577 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-dr2qv_ee4f2b54-1f7b-469d-9d41-ad4d57c3bf89/manager/0.log" Dec 16 14:21:17 crc kubenswrapper[4757]: I1216 14:21:17.953849 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:21:17 crc kubenswrapper[4757]: E1216 14:21:17.955717 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:21:30 crc kubenswrapper[4757]: I1216 14:21:30.948649 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:21:30 crc kubenswrapper[4757]: E1216 14:21:30.949464 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:21:33 crc kubenswrapper[4757]: I1216 14:21:33.858664 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qr4dp_cf6bf9c1-5d43-4ab3-a38f-d96308345ff4/control-plane-machine-set-operator/0.log" Dec 16 14:21:34 crc kubenswrapper[4757]: I1216 14:21:34.023669 4757 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wg8vk_ff965e39-8bf4-40d8-b7af-702f0c47bbb4/machine-api-operator/0.log" Dec 16 14:21:34 crc kubenswrapper[4757]: I1216 14:21:34.040855 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wg8vk_ff965e39-8bf4-40d8-b7af-702f0c47bbb4/kube-rbac-proxy/0.log" Dec 16 14:21:42 crc kubenswrapper[4757]: I1216 14:21:42.948586 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:21:42 crc kubenswrapper[4757]: E1216 14:21:42.949369 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:21:46 crc kubenswrapper[4757]: I1216 14:21:46.122489 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-dd8lt_fc655451-6b29-42c9-836d-ae8ae9a5d77b/cert-manager-controller/0.log" Dec 16 14:21:46 crc kubenswrapper[4757]: I1216 14:21:46.344608 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-82cdt_1a132cd7-a7ae-476a-ad05-9a2ec1981349/cert-manager-cainjector/0.log" Dec 16 14:21:46 crc kubenswrapper[4757]: I1216 14:21:46.442817 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-cgxqx_04fe3c89-14a7-4830-b290-538d3ae20a12/cert-manager-webhook/0.log" Dec 16 14:21:57 crc kubenswrapper[4757]: I1216 14:21:57.948860 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:21:57 crc kubenswrapper[4757]: E1216 14:21:57.949618 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:21:59 crc kubenswrapper[4757]: I1216 14:21:59.486242 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-b9shc_536f7375-828d-41f1-afd3-509271873ae2/nmstate-console-plugin/0.log" Dec 16 14:21:59 crc kubenswrapper[4757]: I1216 14:21:59.675587 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8dspc_7a0c9c35-f5d3-4503-831d-840fdc460911/nmstate-handler/0.log" Dec 16 14:21:59 crc kubenswrapper[4757]: I1216 14:21:59.761849 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-gj6pf_e6a65757-6cab-4b58-a399-d186414d6485/kube-rbac-proxy/0.log" Dec 16 14:21:59 crc kubenswrapper[4757]: I1216 14:21:59.818693 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-gj6pf_e6a65757-6cab-4b58-a399-d186414d6485/nmstate-metrics/0.log" Dec 16 14:21:59 crc kubenswrapper[4757]: I1216 14:21:59.985713 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-crfw5_13ed71f2-85e0-4dc6-94c1-82e12982c67f/nmstate-operator/0.log" Dec 16 14:22:00 crc kubenswrapper[4757]: I1216 14:22:00.090843 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-l6pg2_97bad1b0-8ffc-4e2e-ac9f-d6f5ed747922/nmstate-webhook/0.log" Dec 16 14:22:10 crc kubenswrapper[4757]: I1216 14:22:10.948630 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:22:10 crc kubenswrapper[4757]: E1216 14:22:10.949295 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.067388 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-jwvt9_dd784875-6828-4554-8791-24182d80b82f/controller/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.134119 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-jwvt9_dd784875-6828-4554-8791-24182d80b82f/kube-rbac-proxy/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.213743 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.393612 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.427473 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.439045 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.462214 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.689088 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.691306 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.806598 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.829954 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:22:16 crc kubenswrapper[4757]: I1216 14:22:16.998860 4757 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-reloader/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.019517 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-frr-files/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.036829 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/controller/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.041899 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/cp-metrics/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.221131 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/frr-metrics/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.293336 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/kube-rbac-proxy-frr/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.303793 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/kube-rbac-proxy/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.590709 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/reloader/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.626062 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-cwgds_97dcbeac-b6c6-4892-91fd-4cd45b2b1f9b/frr-k8s-webhook-server/0.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.804971 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84c894d85c-x59lp_019f84e1-6fee-4829-a087-c756c955060a/manager/1.log" Dec 16 14:22:17 crc kubenswrapper[4757]: I1216 14:22:17.947723 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84c894d85c-x59lp_019f84e1-6fee-4829-a087-c756c955060a/manager/0.log" Dec 16 14:22:18 crc kubenswrapper[4757]: I1216 14:22:18.213566 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b74cd5c78-9s4s8_93d50c8f-84bf-4f97-96ab-98cbbd370476/webhook-server/0.log" Dec 16 14:22:18 crc kubenswrapper[4757]: I1216 14:22:18.404850 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-996kc_8074db35-8766-4cc2-bc06-be8a150f92e9/kube-rbac-proxy/0.log" Dec 16 14:22:18 crc kubenswrapper[4757]: I1216 14:22:18.613331 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dfprv_0ea2db10-97ca-4173-9766-c34220e3958b/frr/0.log" Dec 16 14:22:18 crc kubenswrapper[4757]: I1216 14:22:18.885546 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-996kc_8074db35-8766-4cc2-bc06-be8a150f92e9/speaker/0.log" Dec 16 14:22:25 crc kubenswrapper[4757]: I1216 14:22:25.949208 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:22:25 crc kubenswrapper[4757]: E1216 14:22:25.951710 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.405603 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/util/0.log" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.649245 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/pull/0.log" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.684050 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/util/0.log" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.692624 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/pull/0.log" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.888361 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/util/0.log" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.892377 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/extract/0.log" Dec 16 14:22:31 crc kubenswrapper[4757]: I1216 14:22:31.903211 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4225vx_6bdd3557-11e0-4bc6-b282-80d9d44ecac4/pull/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.056554 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/util/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.250574 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/pull/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.317219 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/pull/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.331856 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/util/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.509651 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/util/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.514165 4757 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/pull/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.547580 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa857tct_4aeee37c-9135-4d1a-8e52-181ea394acc6/extract/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.720370 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-utilities/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.927079 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-content/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.927723 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-utilities/0.log" Dec 16 14:22:32 crc kubenswrapper[4757]: I1216 14:22:32.989200 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-content/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.141149 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-content/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.189566 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/extract-utilities/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.452836 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-utilities/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.730660 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-utilities/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.774544 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-content/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.811605 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vnn8x_91c99d1e-682b-4fc3-a4de-594f16bfb4d7/registry-server/0.log" Dec 16 14:22:33 crc kubenswrapper[4757]: I1216 14:22:33.875963 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-content/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.063054 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-content/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.102736 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/extract-utilities/0.log" Dec 16 14:22:34 crc 
kubenswrapper[4757]: I1216 14:22:34.391059 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z52vp_b40bf055-8b99-4c86-9e45-ed2253aa09a1/marketplace-operator/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.543494 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-utilities/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.724667 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-utilities/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.798186 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9fpvh_87336683-d0ae-4df9-91a8-881fa54e49b9/registry-server/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.851389 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-content/0.log" Dec 16 14:22:34 crc kubenswrapper[4757]: I1216 14:22:34.889618 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-content/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.078349 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-utilities/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.162768 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/extract-content/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.325785 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tw282_52fab5a3-5b60-47e2-a517-37c8d9adc3c1/registry-server/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.401163 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-utilities/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.560456 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-utilities/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.613246 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-content/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.622806 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-content/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.852657 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-content/0.log" Dec 16 14:22:35 crc kubenswrapper[4757]: I1216 14:22:35.881104 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/extract-utilities/0.log" Dec 16 14:22:36 crc kubenswrapper[4757]: I1216 
14:22:36.199231 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8djct_4e45bdb9-422b-4864-b3fc-4aeab70108a3/registry-server/0.log" Dec 16 14:22:36 crc kubenswrapper[4757]: I1216 14:22:36.949849 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:22:36 crc kubenswrapper[4757]: E1216 14:22:36.950203 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:22:48 crc kubenswrapper[4757]: I1216 14:22:48.948564 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:22:48 crc kubenswrapper[4757]: E1216 14:22:48.949359 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:23:03 crc kubenswrapper[4757]: I1216 14:23:03.949186 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:23:03 crc kubenswrapper[4757]: E1216 14:23:03.950091 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:23:07 crc kubenswrapper[4757]: E1216 14:23:07.935038 4757 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.110:41106->38.102.83.110:43565: write tcp 38.102.83.110:41106->38.102.83.110:43565: write: broken pipe Dec 16 14:23:14 crc kubenswrapper[4757]: I1216 14:23:14.956916 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:23:14 crc kubenswrapper[4757]: E1216 14:23:14.957750 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:23:27 crc kubenswrapper[4757]: I1216 14:23:27.949253 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:23:27 crc kubenswrapper[4757]: E1216 14:23:27.950092 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:23:39 crc kubenswrapper[4757]: I1216 14:23:39.948573 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:23:39 crc kubenswrapper[4757]: E1216 14:23:39.949392 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:23:53 crc kubenswrapper[4757]: I1216 14:23:53.948940 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:23:53 crc kubenswrapper[4757]: E1216 14:23:53.949828 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:24:04 crc kubenswrapper[4757]: I1216 14:24:04.956999 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:24:04 crc kubenswrapper[4757]: E1216 14:24:04.958280 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:24:18 crc kubenswrapper[4757]: I1216 14:24:18.950468 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:24:18 crc kubenswrapper[4757]: E1216 14:24:18.951345 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:24:32 crc kubenswrapper[4757]: I1216 14:24:32.949677 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:24:32 crc kubenswrapper[4757]: E1216 14:24:32.950500 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:24:43 crc kubenswrapper[4757]: I1216 14:24:43.949898 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:24:43 crc kubenswrapper[4757]: E1216 14:24:43.951779 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:24:54 crc kubenswrapper[4757]: I1216 14:24:54.057857 4757 generic.go:334] "Generic (PLEG): container finished" podID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerID="3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf" exitCode=0 Dec 16 14:24:54 crc kubenswrapper[4757]: I1216 14:24:54.058058 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" event={"ID":"3803726d-b2f2-4424-9e53-0c9186d1b450","Type":"ContainerDied","Data":"3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf"} Dec 16 14:24:54 crc kubenswrapper[4757]: I1216 14:24:54.059735 4757 scope.go:117] "RemoveContainer" containerID="3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf" Dec 16 14:24:54 crc kubenswrapper[4757]: I1216 14:24:54.959584 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:24:54 crc kubenswrapper[4757]: E1216 14:24:54.959980 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:24:55 crc kubenswrapper[4757]: I1216 14:24:55.095794 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wlqnx_must-gather-qxbnt_3803726d-b2f2-4424-9e53-0c9186d1b450/gather/0.log" Dec 16 14:25:06 crc kubenswrapper[4757]: I1216 14:25:06.949993 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1" Dec 16 14:25:06 crc kubenswrapper[4757]: E1216 14:25:06.950987 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.208560 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlqnx/must-gather-qxbnt"] Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.209223 4757 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="copy" 
containerID="cri-o://6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8" gracePeriod=2 Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.217766 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlqnx/must-gather-qxbnt"] Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.649841 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wlqnx_must-gather-qxbnt_3803726d-b2f2-4424-9e53-0c9186d1b450/copy/0.log" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.651233 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.697065 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3803726d-b2f2-4424-9e53-0c9186d1b450-must-gather-output\") pod \"3803726d-b2f2-4424-9e53-0c9186d1b450\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.697144 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxss\" (UniqueName: \"kubernetes.io/projected/3803726d-b2f2-4424-9e53-0c9186d1b450-kube-api-access-rbxss\") pod \"3803726d-b2f2-4424-9e53-0c9186d1b450\" (UID: \"3803726d-b2f2-4424-9e53-0c9186d1b450\") " Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.703110 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3803726d-b2f2-4424-9e53-0c9186d1b450-kube-api-access-rbxss" (OuterVolumeSpecName: "kube-api-access-rbxss") pod "3803726d-b2f2-4424-9e53-0c9186d1b450" (UID: "3803726d-b2f2-4424-9e53-0c9186d1b450"). InnerVolumeSpecName "kube-api-access-rbxss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.799234 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxss\" (UniqueName: \"kubernetes.io/projected/3803726d-b2f2-4424-9e53-0c9186d1b450-kube-api-access-rbxss\") on node \"crc\" DevicePath \"\"" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.882499 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3803726d-b2f2-4424-9e53-0c9186d1b450-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3803726d-b2f2-4424-9e53-0c9186d1b450" (UID: "3803726d-b2f2-4424-9e53-0c9186d1b450"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.901953 4757 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3803726d-b2f2-4424-9e53-0c9186d1b450-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 16 14:25:08 crc kubenswrapper[4757]: I1216 14:25:08.960786 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" path="/var/lib/kubelet/pods/3803726d-b2f2-4424-9e53-0c9186d1b450/volumes" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.238342 4757 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wlqnx_must-gather-qxbnt_3803726d-b2f2-4424-9e53-0c9186d1b450/copy/0.log" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.239561 4757 generic.go:334] "Generic (PLEG): container finished" podID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerID="6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8" exitCode=143 Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.239612 4757 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlqnx/must-gather-qxbnt" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.239649 4757 scope.go:117] "RemoveContainer" containerID="6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.257587 4757 scope.go:117] "RemoveContainer" containerID="3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.304258 4757 scope.go:117] "RemoveContainer" containerID="6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8" Dec 16 14:25:09 crc kubenswrapper[4757]: E1216 14:25:09.304678 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8\": container with ID starting with 6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8 not found: ID does not exist" containerID="6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.304710 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8"} err="failed to get container status \"6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8\": rpc error: code = NotFound desc = could not find container \"6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8\": container with ID starting with 6763c930076b4cfa30b02114c52753334df7500006b762b58710e8a7db9d36d8 not found: ID does not exist" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 14:25:09.304738 4757 scope.go:117] "RemoveContainer" containerID="3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf" Dec 16 14:25:09 crc kubenswrapper[4757]: E1216 14:25:09.304928 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf\": container with ID starting with 3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf not found: ID does not exist" containerID="3ade8f9d97efe6903646eb5e774031cecf3107d6408b7479f6040d272af062bf" Dec 16 14:25:09 crc kubenswrapper[4757]: I1216 
Dec 16 14:25:17 crc kubenswrapper[4757]: I1216 14:25:17.949025 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"
Dec 16 14:25:17 crc kubenswrapper[4757]: E1216 14:25:17.950736 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:25:31 crc kubenswrapper[4757]: I1216 14:25:31.949611 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"
Dec 16 14:25:31 crc kubenswrapper[4757]: E1216 14:25:31.950392 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:25:33 crc kubenswrapper[4757]: I1216 14:25:33.115549 4757 scope.go:117] "RemoveContainer" containerID="8f6cca2bcb9278eb9a3b2d2d1313a6c48b2f4b9c17f76eccc1cfd69256c17eb7"
Dec 16 14:25:42 crc kubenswrapper[4757]: I1216 14:25:42.949304 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"
Dec 16 14:25:42 crc kubenswrapper[4757]: E1216 14:25:42.950173 4757 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tm6vt_openshift-machine-config-operator(43be7319-eac3-4e51-9560-e12d51e97ca6)\"" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6"
Dec 16 14:25:56 crc kubenswrapper[4757]: I1216 14:25:56.952306 4757 scope.go:117] "RemoveContainer" containerID="16ff55be5adfa9658fcf30408305469995ead72ce3968ee9b971d36fbf20bdd1"
Dec 16 14:25:57 crc kubenswrapper[4757]: I1216 14:25:57.691174 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" event={"ID":"43be7319-eac3-4e51-9560-e12d51e97ca6","Type":"ContainerStarted","Data":"963f1ee90bc0f0e0a5304e4174eeced3711b56826b0b258f650793266974fbf2"}
Dec 16 14:26:33 crc kubenswrapper[4757]: I1216 14:26:33.202906 4757 scope.go:117] "RemoveContainer" containerID="c0e9f172165a8649673865958ba5c0382f6e8eb4cb73e56a83d183a8bb6c5964"
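
Note: the machine-config-daemon sequence above shows CrashLoopBackOff from the pod worker's side: the sync attempts at 14:25:17, 14:25:31 and 14:25:42 are rejected because the restart back-off window (capped at 5m0s) has not yet expired, and the attempt at 14:25:56 finally restarts the container (ContainerStarted at 14:25:57). The kubelet doubles the restart delay after each crash up to the cap; the sketch below illustrates that schedule. The 5m cap is taken from the "back-off 5m0s" message; the 10s initial delay is an assumption for illustration:

    package main

    import (
    	"fmt"
    	"time"
    )

    // backoff returns an assumed kubelet-style restart delay after a given
    // number of consecutive crashes: doubling from an initial delay, capped.
    func backoff(restarts int) time.Duration {
    	const (
    		initial  = 10 * time.Second // assumption for illustration
    		maxDelay = 5 * time.Minute  // matches "back-off 5m0s" in the log
    	)
    	d := initial
    	for i := 0; i < restarts && d < maxDelay; i++ {
    		d *= 2
    	}
    	if d > maxDelay {
    		d = maxDelay
    	}
    	return d
    }

    func main() {
    	for r := 0; r <= 6; r++ {
    		fmt.Printf("crash #%d -> next restart in %v\n", r, backoff(r))
    	}
    	// crash #5 and later print 5m0s, the cap seen in the log above.
    }
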
pods=["openshift-marketplace/redhat-operators-vqg55"] Dec 16 14:26:48 crc kubenswrapper[4757]: E1216 14:26:48.787137 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="copy" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.787157 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="copy" Dec 16 14:26:48 crc kubenswrapper[4757]: E1216 14:26:48.787188 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="gather" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.787198 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="gather" Dec 16 14:26:48 crc kubenswrapper[4757]: E1216 14:26:48.787241 4757 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4497265-5045-4886-93c8-55b10c379d7f" containerName="container-00" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.787251 4757 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4497265-5045-4886-93c8-55b10c379d7f" containerName="container-00" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.787463 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="copy" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.787500 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="3803726d-b2f2-4424-9e53-0c9186d1b450" containerName="gather" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.787514 4757 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4497265-5045-4886-93c8-55b10c379d7f" containerName="container-00" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.789237 4757 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.801229 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqg55"] Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.919213 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-catalog-content\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.919523 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r729\" (UniqueName: \"kubernetes.io/projected/bf1aeed3-7036-4537-b715-5c28a3e775be-kube-api-access-4r729\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:48 crc kubenswrapper[4757]: I1216 14:26:48.919656 4757 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-utilities\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.021121 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-catalog-content\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.021376 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r729\" (UniqueName: \"kubernetes.io/projected/bf1aeed3-7036-4537-b715-5c28a3e775be-kube-api-access-4r729\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.021474 4757 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-utilities\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.021830 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-utilities\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.021984 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-catalog-content\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.048696 4757 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4r729\" (UniqueName: \"kubernetes.io/projected/bf1aeed3-7036-4537-b715-5c28a3e775be-kube-api-access-4r729\") pod \"redhat-operators-vqg55\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.113969 4757 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:26:49 crc kubenswrapper[4757]: I1216 14:26:49.627023 4757 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqg55"] Dec 16 14:26:50 crc kubenswrapper[4757]: I1216 14:26:50.207669 4757 generic.go:334] "Generic (PLEG): container finished" podID="bf1aeed3-7036-4537-b715-5c28a3e775be" containerID="58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55" exitCode=0 Dec 16 14:26:50 crc kubenswrapper[4757]: I1216 14:26:50.207763 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerDied","Data":"58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55"} Dec 16 14:26:50 crc kubenswrapper[4757]: I1216 14:26:50.207946 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerStarted","Data":"deef38df3b0e01025640e6ee322791d08b15435972c8e1aa59942c7a3a632737"} Dec 16 14:26:50 crc kubenswrapper[4757]: I1216 14:26:50.211311 4757 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 14:26:52 crc kubenswrapper[4757]: I1216 14:26:52.226686 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerStarted","Data":"e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816"} Dec 16 14:26:55 crc kubenswrapper[4757]: I1216 14:26:55.263443 4757 generic.go:334] "Generic (PLEG): container finished" podID="bf1aeed3-7036-4537-b715-5c28a3e775be" containerID="e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816" exitCode=0 Dec 16 14:26:55 crc kubenswrapper[4757]: I1216 14:26:55.263522 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerDied","Data":"e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816"} Dec 16 14:26:56 crc kubenswrapper[4757]: I1216 14:26:56.275997 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerStarted","Data":"5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8"} Dec 16 14:26:56 crc kubenswrapper[4757]: I1216 14:26:56.301516 4757 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vqg55" podStartSLOduration=2.8021053350000003 podStartE2EDuration="8.301495412s" podCreationTimestamp="2025-12-16 14:26:48 +0000 UTC" firstStartedPulling="2025-12-16 14:26:50.211067917 +0000 UTC m=+5995.638811713" lastFinishedPulling="2025-12-16 14:26:55.710457994 +0000 UTC m=+6001.138201790" observedRunningTime="2025-12-16 14:26:56.292327146 +0000 UTC m=+6001.720070952" watchObservedRunningTime="2025-12-16 14:26:56.301495412 +0000 UTC m=+6001.729239208" Dec 16 14:26:59 crc 
Dec 16 14:26:59 crc kubenswrapper[4757]: I1216 14:26:59.114751 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vqg55"
Dec 16 14:26:59 crc kubenswrapper[4757]: I1216 14:26:59.115109 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vqg55"
Dec 16 14:27:00 crc kubenswrapper[4757]: I1216 14:27:00.163728 4757 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vqg55" podUID="bf1aeed3-7036-4537-b715-5c28a3e775be" containerName="registry-server" probeResult="failure" output=<
Dec 16 14:27:00 crc kubenswrapper[4757]: timeout: failed to connect service ":50051" within 1s
Dec 16 14:27:00 crc kubenswrapper[4757]: >
Dec 16 14:27:09 crc kubenswrapper[4757]: I1216 14:27:09.164282 4757 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vqg55"
Dec 16 14:27:09 crc kubenswrapper[4757]: I1216 14:27:09.213021 4757 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vqg55"
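
Note: the startup probe output above ("timeout: failed to connect service \":50051\" within 1s") is characteristic of a gRPC health check against the registry-server port, most likely grpc_health_probe. At 14:27:00 the catalog server was still loading and nothing accepted connections on :50051; by 14:27:09 both the startup and readiness probes pass. The connect-timeout part of that check reduces to something like the following (a stand-in sketch, not the actual probe binary):

    package main

    import (
    	"fmt"
    	"net"
    	"time"
    )

    // probe dials the given TCP address with a hard deadline, mimicking the
    // connect phase of the registry-server startup probe on :50051.
    func probe(addr string, timeout time.Duration) error {
    	conn, err := net.DialTimeout("tcp", addr, timeout)
    	if err != nil {
    		return fmt.Errorf("timeout: failed to connect service %q within %v", addr, timeout)
    	}
    	return conn.Close()
    }

    func main() {
    	if err := probe(":50051", 1*time.Second); err != nil {
    		fmt.Println(err) // the failure mode logged at 14:27:00
    	}
    }
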
Need to start a new one" pod="openshift-marketplace/redhat-operators-vqg55" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.409492 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerDied","Data":"5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8"} Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.410741 4757 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqg55" event={"ID":"bf1aeed3-7036-4537-b715-5c28a3e775be","Type":"ContainerDied","Data":"deef38df3b0e01025640e6ee322791d08b15435972c8e1aa59942c7a3a632737"} Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.410768 4757 scope.go:117] "RemoveContainer" containerID="5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.434095 4757 scope.go:117] "RemoveContainer" containerID="e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.475339 4757 scope.go:117] "RemoveContainer" containerID="58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.489103 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-utilities\") pod \"bf1aeed3-7036-4537-b715-5c28a3e775be\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.489543 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r729\" (UniqueName: \"kubernetes.io/projected/bf1aeed3-7036-4537-b715-5c28a3e775be-kube-api-access-4r729\") pod \"bf1aeed3-7036-4537-b715-5c28a3e775be\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.489819 4757 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-catalog-content\") pod \"bf1aeed3-7036-4537-b715-5c28a3e775be\" (UID: \"bf1aeed3-7036-4537-b715-5c28a3e775be\") " Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.490619 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-utilities" (OuterVolumeSpecName: "utilities") pod "bf1aeed3-7036-4537-b715-5c28a3e775be" (UID: "bf1aeed3-7036-4537-b715-5c28a3e775be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.495967 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1aeed3-7036-4537-b715-5c28a3e775be-kube-api-access-4r729" (OuterVolumeSpecName: "kube-api-access-4r729") pod "bf1aeed3-7036-4537-b715-5c28a3e775be" (UID: "bf1aeed3-7036-4537-b715-5c28a3e775be"). InnerVolumeSpecName "kube-api-access-4r729". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.509784 4757 scope.go:117] "RemoveContainer" containerID="5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8" Dec 16 14:27:11 crc kubenswrapper[4757]: E1216 14:27:11.510334 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8\": container with ID starting with 5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8 not found: ID does not exist" containerID="5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.510371 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8"} err="failed to get container status \"5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8\": rpc error: code = NotFound desc = could not find container \"5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8\": container with ID starting with 5ff117904d07061adb3a3076de9afcdb3073ce3a28fb46ee3fd310934ef1c0f8 not found: ID does not exist" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.510419 4757 scope.go:117] "RemoveContainer" containerID="e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816" Dec 16 14:27:11 crc kubenswrapper[4757]: E1216 14:27:11.510847 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816\": container with ID starting with e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816 not found: ID does not exist" containerID="e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.510889 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816"} err="failed to get container status \"e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816\": rpc error: code = NotFound desc = could not find container \"e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816\": container with ID starting with e4e2d1bd4d2f47f1eb2035eca9f38d3a2ee248de5b7dcf689d58ab0ed2ce3816 not found: ID does not exist" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.510917 4757 scope.go:117] "RemoveContainer" containerID="58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55" Dec 16 14:27:11 crc kubenswrapper[4757]: E1216 14:27:11.512087 4757 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55\": container with ID starting with 58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55 not found: ID does not exist" containerID="58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.512118 4757 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55"} err="failed to get container status \"58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55\": rpc error: code = NotFound desc = could not 
find container \"58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55\": container with ID starting with 58290e001cdbcf1df7366eccf5bf3d1ce8775b10e93d812169f2ea9a8f587c55 not found: ID does not exist" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.592799 4757 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r729\" (UniqueName: \"kubernetes.io/projected/bf1aeed3-7036-4537-b715-5c28a3e775be-kube-api-access-4r729\") on node \"crc\" DevicePath \"\"" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.592840 4757 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.625322 4757 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf1aeed3-7036-4537-b715-5c28a3e775be" (UID: "bf1aeed3-7036-4537-b715-5c28a3e775be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.695104 4757 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf1aeed3-7036-4537-b715-5c28a3e775be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.745678 4757 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqg55"] Dec 16 14:27:11 crc kubenswrapper[4757]: I1216 14:27:11.768927 4757 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vqg55"] Dec 16 14:27:12 crc kubenswrapper[4757]: I1216 14:27:12.960684 4757 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1aeed3-7036-4537-b715-5c28a3e775be" path="/var/lib/kubelet/pods/bf1aeed3-7036-4537-b715-5c28a3e775be/volumes" Dec 16 14:28:21 crc kubenswrapper[4757]: I1216 14:28:21.180877 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:28:21 crc kubenswrapper[4757]: I1216 14:28:21.181429 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:28:51 crc kubenswrapper[4757]: I1216 14:28:51.181622 4757 patch_prober.go:28] interesting pod/machine-config-daemon-tm6vt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:28:51 crc kubenswrapper[4757]: I1216 14:28:51.182148 4757 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tm6vt" podUID="43be7319-eac3-4e51-9560-e12d51e97ca6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"